957 results for Statistics - Analysis
Abstract:
We investigate the directional distribution of heavy neutral atoms in the heliosphere using heavy neutral maps generated with the IBEX-Lo instrument over three years, from 2009 to 2011. The interstellar neutral (ISN) O&Ne gas flow was found in the first-year heavy neutral map at 601 eV, and its flow direction and temperature were studied. However, due to the low counting statistics, researchers have not treated the full sky maps in detail. The main goal of this study is to evaluate the statistical significance of each pixel in the heavy neutral maps to gain a better understanding of the directional distribution of heavy neutral atoms in the heliosphere. Here, we examine three statistical analysis methods: the signal-to-noise filter, the confidence limit method, and the cluster analysis method. These methods allow us to separate the background from areas where the heavy neutral signal is statistically significant, and they enable the consistent detection of heavy neutral atom structures. The main emission feature expands toward lower longitude and higher latitude from the observational peak of the ISN O&Ne gas flow; we call this emission the extended tail. It may be an imprint of secondary oxygen atoms generated by charge exchange between ISN hydrogen atoms and oxygen ions in the outer heliosheath.
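Per-pixel significance screening of the kind performed by a signal-to-noise filter can be sketched as follows, assuming simple Poisson counting statistics; the pixel counts and the 3-sigma threshold below are illustrative assumptions, not actual IBEX-Lo data:

```python
import math

# Hypothetical per-pixel counts: observed counts and estimated background.
# For Poisson counts, the standard deviation of a count N is sqrt(N).
pixels = [
    {"counts": 25, "background": 9},
    {"counts": 12, "background": 10},
    {"counts": 40, "background": 8},
]

def snr(counts, background):
    """Signal-to-noise ratio of background-subtracted Poisson counts."""
    signal = counts - background
    noise = math.sqrt(counts + background)  # variances add under subtraction
    return signal / noise

threshold = 3.0  # keep pixels whose signal exceeds 3 sigma
significant = [p for p in pixels if snr(p["counts"], p["background"]) > threshold]
for p in pixels:
    print(p, f"S/N = {snr(p['counts'], p['background']):.2f}")
print("significant pixels:", len(significant))
```

Pixels failing the threshold are treated as background; only the surviving pixels would enter a subsequent structure (cluster) analysis.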
Abstract:
The South Carolina Department of Consumer Affairs publishes an annual mortgage log report as a requirement of the South Carolina Mortgage Lending Act, which became effective on January 1, 2010. The mortgage log report analyzes the following data concerning all mortgage loan applications taken: the borrower’s credit score, term of the loan, annual percentage rate, type of rate, and appraised value of the property. The mortgage log report also analyzes data required by the Home Mortgage Disclosure Act, including the following information: the loan type, property type, purpose of the loan, owner/occupancy status, loan amount, action taken, reason for denial, property location, gross annual income, purchaser of the loan, rate spread, HOEPA status, and lien status, as well as the applicant’s and co-applicant’s race, ethnicity, and gender.
Abstract:
Master's degree in Finance
Abstract:
Background: Preterm labor, defined as live-birth delivery before 37 weeks of gestation, is a main determinant of neonatal morbidity and mortality around the world. Objective: The aim of this study was to determine the prevalence of preterm labor in Iran through a meta-analysis, to serve as a summary measure for policy makers in this field. Materials and Methods: In this meta-analysis, the Thomson (Web of Knowledge), PubMed/Medline, Science Direct, Scopus, Google Scholar, Iranmedex, Scientific Information Database (SID), Magiran, and Medlib databases were searched for articles in English and Persian published between 1995 and 2014. After applying the inclusion and exclusion criteria, 14 studies (out of 1,370 publications) were selected. Data were analyzed using Stata software, version 11. The heterogeneity of the reported prevalences among studies was evaluated by the Chi-square-based Q test and the I2 statistic. Results: The Q test and I2 statistic revealed severe heterogeneity (Q=2505.12, p-value < 0.001 and I2 = 99.5%); consequently, a random-effects model was used for the meta-analysis. Based on the random-effects model, the overall estimated prevalence of preterm labor in Iran was 9.2% (95% CI: 7.6–10.7). Conclusion: The present study summarized the results of previous studies and provides a comprehensive view of preterm delivery in Iran. To reduce this prevalence in the coming years, identifying the contributing factors and implementing interventional and preventive actions seem necessary.
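The heterogeneity assessment and pooling workflow described above (Cochran's Q, the I2 statistic, and a random-effects model) can be sketched with a DerSimonian-Laird estimator; the study proportions and sample sizes below are hypothetical placeholders, not the 14 Iranian studies:

```python
import math

# Hypothetical study-level prevalence estimates (proportions) and sample sizes
p = [0.07, 0.11, 0.08, 0.12, 0.06]
n = [900, 1200, 650, 1500, 800]

# Within-study variance of a proportion and inverse-variance weights
var = [pi * (1 - pi) / ni for pi, ni in zip(p, n)]
w = [1.0 / v for v in var]

# Fixed-effect pooled estimate and Cochran's Q
p_fixed = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
Q = sum(wi * (pi - p_fixed) ** 2 for wi, pi in zip(w, p))
df = len(p) - 1

# I2: share of total variability attributable to heterogeneity rather than chance
I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

# DerSimonian-Laird between-study variance tau^2 and random-effects pooling
C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - df) / C)
w_re = [1.0 / (v + tau2) for v in var]
p_re = sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)
se_re = math.sqrt(1.0 / sum(w_re))
print(f"Q={Q:.2f}, I2={I2:.1f}%, pooled prevalence={p_re:.3f} "
      f"(95% CI {p_re - 1.96 * se_re:.3f}-{p_re + 1.96 * se_re:.3f})")
```

With severe heterogeneity (as in the abstract, I2 = 99.5%), tau^2 dominates the within-study variances, so the random-effects weights become nearly equal across studies.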
Abstract:
The wave energy industry is entering a new phase of pre-commercial and commercial deployments of full-scale devices, so a better understanding of seaway variability is critical to the successful operation of devices. The response of Wave Energy Converters (WECs) to incident waves governs their operational performance, and for many devices this is highly dependent on spectral shape due to their resonant properties. Various methods of wave measurement are presented, along with analysis techniques and empirical models. Resource assessments, device performance predictions and monitoring of operational devices will often be based on summary statistics and assume a standard spectral shape such as Pierson-Moskowitz or JONSWAP. Furthermore, these are typically derived from the closest available wave data, frequently separated from the site on scales of the order of 1 km. Therefore, deviation of seaways from standard spectral shapes and spatial inconsistency between the measurement point and the device site will cause inaccuracies in the performance assessment. This thesis categorises time and frequency domain analysis techniques that can be used to identify changes in a sea state from record to record. Device-specific issues such as dimensional scaling of sea states and power output are discussed, along with potential differences that arise between estimated and actual output power of a WEC due to spectral shape variation. This is investigated using measured data from various phases of device development.
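The summary statistics mentioned above are spectral moments of the variance density spectrum; a minimal sketch using a Pierson-Moskowitz shape (parameterized by significant wave height Hs and peak period Tp, both hypothetical values here) recovers Hm0 and the energy period Te:

```python
import math

def pm_spectrum(f, hs, tp):
    """Pierson-Moskowitz variance density S(f), parameterized by Hs and Tp."""
    fp = 1.0 / tp
    return (5.0 / 16.0) * hs**2 * fp**4 * f**-5 * math.exp(-1.25 * (fp / f) ** 4)

# Frequency grid (Hz) and a hypothetical design sea state
freqs = [0.01 * k for k in range(2, 101)]   # 0.02-1.00 Hz
hs, tp = 2.5, 9.0
S = [pm_spectrum(f, hs, tp) for f in freqs]

def moment(order):
    """Spectral moment m_n = integral of f^n S(f) df, by trapezoidal rule."""
    total = 0.0
    for i in range(len(freqs) - 1):
        a = freqs[i] ** order * S[i]
        b = freqs[i + 1] ** order * S[i + 1]
        total += 0.5 * (a + b) * (freqs[i + 1] - freqs[i])
    return total

m0, m_minus1 = moment(0), moment(-1)
hm0 = 4.0 * math.sqrt(m0)      # significant wave height from the spectrum
te = m_minus1 / m0             # energy period, used in WEC power estimates
print(f"Hm0 = {hm0:.2f} m, Te = {te:.2f} s")
```

The same two summary statistics computed from a measured spectrum of different shape would generally disagree with the standard-spectrum values, which is the source of the performance-assessment inaccuracy discussed in the thesis.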
Abstract:
Cracks or checks in biscuits weaken the material and cause the product to break at low load levels, which is perceived as injurious to product quality. In this work, the structural response of circular digestive biscuits, with diameter 72 mm and thickness 7.2 mm, simply supported around the circumference and loaded by a central concentrated force, was investigated by experiment and theory. Tests were conducted to quantify the distribution of breakage strength for structurally sound biscuits, biscuits with natural checks and biscuits with a single known part-through crack. For sound biscuits the breakage force is Normally distributed with a mean of 12.5 N and a standard deviation of 1.2 N. For biscuits with checks, the corresponding values are 9.6 N and 2.62 N respectively. The presence of a crack weakens the biscuit, and strength, as measured by breakage force, falls almost linearly with crack length and crack depth. The orientation of the crack, whether radial or tangential, and its location (i.e. the position of the crack mid-point on the biscuit surface) are also important. Deep radial cracks located close to the biscuit centre can reduce the strength by up to 50%. Two separate failure criteria were examined for sound and cracked biscuits respectively. The results from these tests were in good accord with theory. For a biscuit without defects, breakage occurred when the maximum biscuit stress reached or exceeded the failure stress of 420 kPa. For a biscuit with cracks, breakage occurred as above or, alternatively, when its critical stress intensity factor of 18 kPa·m^0.5 was reached.
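The reported Normal distributions of breakage force make it straightforward to estimate the fraction of biscuits expected to fail below a given handling load; the 8 N load below is a hypothetical example, the means and standard deviations are those quoted in the abstract:

```python
import math

def normal_cdf(x, mu, sigma):
    """P(X <= x) for X ~ Normal(mu, sigma), via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Breakage-force distributions from the abstract: (mean, sd) in newtons
sound = (12.5, 1.2)
checked = (9.6, 2.62)

# Fraction of biscuits expected to break at or below a handling load of 8 N
load = 8.0
p_sound = normal_cdf(load, *sound)
p_checked = normal_cdf(load, *checked)
print(f"P(break <= {load} N): sound = {p_sound:.4f}, checked = {p_checked:.4f}")
```

The contrast is stark: a checked biscuit is orders of magnitude more likely to fail at this load, both because the mean strength is lower and because the spread is wider.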
Abstract:
This thesis is concerned with change point analysis for time series, i.e. with the detection of structural breaks in time-ordered, random data. This long-standing research field has regained popularity over the last few years and is, like statistical analysis in general, still undergoing a transformation to high-dimensional problems. We focus on the fundamental »change in the mean« problem and provide extensions of the classical non-parametric Darling-Erdős-type cumulative sum (CUSUM) testing and estimation theory within high-dimensional Hilbert space settings. In the first part we contribute to (long run) principal component based testing methods for Hilbert space valued time series under a rather broad (abrupt, epidemic, gradual, multiple) change setting and under dependence. For the dependence structure we consider either traditional m-dependence assumptions or more recently developed m-approximability conditions which cover, e.g., MA, AR and ARCH models. We derive Gumbel and Brownian bridge type approximations of the distribution of the test statistic under the null hypothesis of no change, and consistency conditions under the alternative. A new formulation of the test statistic using projections on subspaces allows us to simplify the standard proof techniques and to weaken common assumptions on the covariance structure. Furthermore, we propose to adjust the principal components by an implicit estimation of a (possible) change direction. This approach adds flexibility to projection based methods, weakens typical technical conditions and provides better consistency properties under the alternative. In the second part we contribute to estimation methods for common changes in the means of panels of Hilbert space valued time series. We analyze weighted CUSUM estimates within a recently proposed »high-dimensional low sample size (HDLSS)« framework, where the sample size is fixed but the number of panels increases.
We derive sharp conditions on the »pointwise asymptotic accuracy« or »uniform asymptotic accuracy« of those estimates in terms of the weighting function. In particular, we prove that a covariance-based correction of Darling-Erdős-type CUSUM estimates is required to guarantee uniform asymptotic accuracy under moderate dependence conditions within panels, and that these conditions are fulfilled, e.g., by any MA(1) time series. As a counterexample we show that for AR(1) time series close to the non-stationary case the dependence is too strong and uniform asymptotic accuracy cannot be ensured. Finally, we conduct simulations to demonstrate that our results are practically applicable and that our methodological suggestions are advantageous.
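The classical univariate change-in-mean CUSUM statistic underlying these extensions can be sketched as follows; this is a minimal i.i.d. sketch on simulated data, not the Hilbert-space-valued version developed in the thesis:

```python
import math
import random

def cusum_statistic(x):
    """Maximum of the standardized CUSUM process for a change in the mean.

    Returns the test statistic and the index at which it is attained,
    which also serves as the change point estimate.
    """
    n = len(x)
    mean = sum(x) / n
    # Variance estimate: plain sample variance (i.i.d. assumption; under
    # dependence a long-run variance estimator would be needed instead).
    var = sum((xi - mean) ** 2 for xi in x) / n
    s, best, khat = 0.0, 0.0, 0
    for k, xi in enumerate(x, start=1):
        s += xi - mean                      # partial sum of centered data
        stat = abs(s) / math.sqrt(n * var)
        if stat > best:
            best, khat = stat, k
    return best, khat

random.seed(1)
# Simulated series: mean 0 for 100 points, then the mean jumps to 1.5
x = [random.gauss(0, 1) for _ in range(100)] + \
    [random.gauss(1.5, 1) for _ in range(100)]
stat, khat = cusum_statistic(x)
print(f"CUSUM statistic = {stat:.2f}, estimated change point = {khat}")
```

Under the null of no change the statistic behaves like the supremum of a Brownian bridge, so a value this far above the usual critical values signals a break, and the argmax locates it near the true change at index 100.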
Abstract:
Maintenance of transport infrastructure assets is widely advocated as key to minimizing current and future costs of the transportation network. While effective maintenance decisions are often a result of engineering skills and practical knowledge, efficient decisions must also account for the net result over an asset's life-cycle. One essential aspect in the long-term perspective of transport infrastructure maintenance is to proactively estimate maintenance needs. In dealing with immediate maintenance actions, support tools that can prioritize potential maintenance candidates are important to obtain an efficient maintenance strategy. This dissertation consists of five individual research papers presenting a microdata analysis approach to transport infrastructure maintenance. Microdata analysis is a multidisciplinary field in which large quantities of data are collected, analyzed, and interpreted to improve decision-making. Increased access to transport infrastructure data enables a deeper understanding of causal effects and a possibility to make predictions of future outcomes. The microdata analysis approach covers the complete process from data collection to actual decisions and is therefore well suited for the task of improving efficiency in transport infrastructure maintenance. Statistical modeling was the selected analysis method in this dissertation and provided solutions to the different problems presented in each of the five papers. In Paper I, a time-to-event model was used to estimate remaining road pavement lifetimes in Sweden. In Paper II, an extension of the model in Paper I assessed the impact of latent variables on road lifetimes, displaying the sections in a road network that are weaker due to, e.g., subsoil conditions or undetected heavy traffic. The study in Paper III incorporated a probabilistic parametric distribution as a representation of road lifetimes into an equation for the marginal cost of road wear.
Differentiated road wear marginal costs for heavy and light vehicles are an important information basis for decisions regarding vehicle miles traveled (VMT) taxation policies. In Paper IV, a distribution-based clustering method was used to distinguish between road segments that are deteriorating and road segments that have a stationary road condition. Within railway networks, temporary speed restrictions are often imposed because of maintenance and must be addressed in order to maintain punctuality. The study in Paper V evaluated the empirical effect of speed restrictions on running time on a Norwegian railway line using a generalized linear mixed model.
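A time-to-event model for remaining pavement lifetime, of the kind used in Paper I, can be illustrated with a parametric Weibull survival sketch; the shape and scale parameters below are hypothetical assumptions, not estimates from the Swedish data:

```python
import math

def weibull_survival(t, shape, scale):
    """S(t) = P(lifetime > t) for a Weibull(shape, scale) lifetime."""
    return math.exp(-((t / scale) ** shape))

def remaining_life_median(age, shape, scale):
    """Median remaining lifetime for a pavement that has survived to `age`.

    Solves S(t) = 0.5 * S(age) for t (the conditional median survival
    time), then subtracts the current age.
    """
    s_age = weibull_survival(age, shape, scale)
    t = scale * (-math.log(0.5 * s_age)) ** (1.0 / shape)
    return t - age

# Hypothetical Weibull parameters for pavement lifetime in years:
# shape > 1 encodes an increasing hazard (wear-out) over time.
shape, scale = 2.5, 20.0
for age in (0, 5, 10, 15):
    print(f"age {age:2d}: median remaining life = "
          f"{remaining_life_median(age, shape, scale):.1f} years")
```

With an increasing hazard, the median remaining life shrinks as the pavement ages, which is exactly the information a maintenance planner needs to prioritize candidates proactively.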
Abstract:
This PowerPoint presentation gives statistics on property taxes throughout the state.
Abstract:
Abstract: This is a study comparing Dubovik aerosol optical depth (AOD) retrievals from AEROCAN (the Canadian sub-network of AERONET) stations with AOD estimates from simulations provided by a chemical transport model (GEOS-Chem: Goddard Earth Observing System Chemistry). The AOD products associated with the Dubovik product are divided into total, fine and coarse mode components. The retrieval period is from January 2009 to January 2013 for 5 Arctic stations (Barrow, Alaska; Resolute Bay, Nunavut; 0PAL and PEARL (Eureka), Nunavut; and Thule, Greenland). We also employed AOD retrievals from 10 other mid-latitude Canadian stations for comparisons with the Arctic stations. The results of our investigation were submitted to Atmosphere-Ocean. To briefly summarize those results, the model generally, but not always, tended to underestimate the (monthly) averaged AOD and its components. We found that the subdivision into fine and coarse mode components could provide unique signatures of particular events (Asian dust), and that the means of characterizing the statistics (log-normal versus normal frequency distributions) was an attribute common to both the retrievals and the model.
Abstract:
The world of Computational Biology and Bioinformatics presently integrates many different areas of expertise, including computer science and electronic engineering. A major aim in Data Science is the development and tuning of specific computational approaches to interpret the complexity of Biology. Molecular biologists and medical doctors rely heavily on interdisciplinary experts capable of understanding the biological background and applying algorithms to find optimal solutions to their problems. With this problem-solving orientation, I was involved in two basic research fields: Cancer Genomics and Enzyme Proteomics. For this reason, what I developed and implemented can be considered a general effort to support data analysis in both Cancer Genomics and Enzyme Proteomics, focusing on enzymes, which catalyse all the biochemical reactions in cells. Specifically, in Cancer Genomics I contributed to the characterization of the intratumoral immune microenvironment in gastrointestinal stromal tumours (GISTs), correlating immune cell population levels with tumour subtypes. I was involved in the setup of strategies for the evaluation and standardization of different approaches for fusion transcript detection in sarcomas that can be applied in routine diagnostics. This was part of a coordinated effort of the Sarcoma working group of "Alleanza Contro il Cancro". In Enzyme Proteomics, I generated a derived database collecting all the human proteins and enzymes known to be associated with genetic disease. I curated the data search in freely available databases such as PDB, UniProt, Humsavar and ClinVar, and I was responsible for searching, updating, and handling the information content and for computing statistics. I also developed a web server, BENZ, which allows researchers to annotate an enzyme sequence with the corresponding Enzyme Commission number, the important feature fully describing the catalysed reaction. In addition, I contributed substantially to the characterization of the enzyme-genetic disease association, towards a better classification of metabolic genetic diseases.
Abstract:
The dynamics and geometry of the material inflowing and outflowing close to the supermassive black hole (SMBH) in active galactic nuclei (AGN) are still uncertain. X-rays are the most suitable way to study the AGN innermost regions because of the Fe Kα emission line, a proxy of accretion, and Fe absorption lines produced by outflows. Winds are typically classified as Warm Absorbers (slow and mildly ionized) and Ultra Fast Outflows (fast and highly ionized). Transient obscurers (optically thick winds that produce strong spectral hardening in X-rays, lasting from days to months) have been observed recently. Emission and absorption features vary on time-scales from hours to years, probing phenomena at different distances from the SMBH. In this work, we use time-resolved spectral analysis to investigate the accretion and ejection flows, to characterize them individually and to search for correlations. We analyzed XMM-Newton data of a set of the brightest Seyfert 1 galaxies that went through an obscuration event: NGC 3783, NGC 3227, NGC 5548, and NGC 985. Our aim is to search for emission/absorption lines in short-duration spectra (∼10 ks), to explore regions as close to the SMBH as the statistics allow, and possibly to catch transient phenomena. First we run a blind search to detect emission/absorption features; then we analyze their evolution with Residual Maps: we visualize positive and negative residuals from the continuum simultaneously in the time-energy plane, looking for patterns and the relevant time-scales. In NGC 3783 we were able to ascribe variations of the Fe Kα emission line to absorption at the same energy due to clumps in the obscurer, whose presence is detected at >3σ, and to determine the size of the clumps. In NGC 3227 we detected a wind at ∼0.2c at ∼2σ, briefly appearing during an obscuration event.
Abstract:
Earthquake prediction is a complex task for scientists due to the rare occurrence of high-intensity earthquakes and their inaccessible depths. Despite this challenge, it is a priority to protect infrastructure and populations living in areas of high seismic risk. Reliable forecasting requires comprehensive knowledge of seismic phenomena. In this thesis, the development, application, and comparison of both deterministic and probabilistic forecasting methods are shown. Regarding the deterministic approach, the implementation of an alarm-based method using the occurrence of strong (fore)shocks, widely felt by the population, as a precursor signal is described. This model is then applied for the retrospective prediction of Italian earthquakes of magnitude M ≥ 5.0, 5.5, and 6.0 that occurred in Italy from 1960 to 2020. Retrospective performance testing is carried out using tests and statistics specific to deterministic alarm-based models. Regarding probabilistic models, this thesis focuses mainly on the EEPAS and ETAS models. Although the EEPAS model has been previously applied and tested in some regions of the world, it has never been used for forecasting Italian earthquakes. In the thesis, the EEPAS model is used to retrospectively forecast Italian shallow earthquakes with magnitude M ≥ 5.0 using new MATLAB software. The forecasting performance of the probabilistic models was compared to that of other models using CSEP binary tests. The EEPAS and ETAS models showed different characteristics for forecasting Italian earthquakes, with EEPAS performing better in the long term and ETAS performing better in the short term. The FORE model, based on strong precursor quakes, is compared to EEPAS and ETAS using an alarm-based deterministic approach. All models perform better than a random forecasting model, with the ETAS and FORE models showing the best performance. However, to fully evaluate forecasting performance, prospective tests should be conducted.
The lack of objective tests for evaluating deterministic models and comparing them with probabilistic ones was a challenge faced during the study.
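The evaluation of an alarm-based model reduces to two quantities plotted on a Molchan diagram: the fraction of space-time covered by alarms (tau) and the miss rate (nu). A minimal sketch on a discretized catalogue follows; all cell indices and event positions are illustrative assumptions, not data from the Italian catalogue:

```python
# Hypothetical retrospective evaluation of an alarm-based forecast on a
# discretized space-time catalogue.
T = 1000                                                   # space-time cells
alarm_cells = set(range(100, 250)) | set(range(600, 700))  # cells on alarm
targets = [120, 180, 240, 610, 650, 820, 905]              # target event cells

tau = len(alarm_cells) / T                    # fraction of space-time on alarm
hits = sum(1 for t in targets if t in alarm_cells)
nu = 1 - hits / len(targets)                  # miss rate

# On a Molchan diagram, a random (unskilled) forecast satisfies nu ≈ 1 - tau;
# a skillful model achieves nu < 1 - tau.
print(f"tau = {tau:.2f}, miss rate nu = {nu:.2f}, random baseline = {1 - tau:.2f}")
```

Here the model misses 2 of 7 events while keeping alarms on for only a quarter of the space-time volume, placing it well below the random diagonal of the Molchan diagram.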
Abstract:
The Fourier transform-infrared (FT-IR) signature of dry samples of DNA and DNA-polypeptide complexes, as studied by IR microspectroscopy using a diamond attenuated total reflection (ATR) objective, has revealed important discriminatory characteristics relative to the PO2(-) vibrational stretchings. However, DNA IR marks that provide information on the sample's richness in hydrogen bonds have not been resolved in the spectral profiles obtained with this objective. Here we investigated the performance of an all-reflecting objective (ARO) for analysis of the FT-IR signal of hydrogen bonds in DNA samples differing in base richness type (salmon testis vs calf thymus). The results obtained using the ARO show prominent band peaks in the spectral region representative of the vibration of nitrogenous base hydrogen bonds and of NH and NH2 groups. The band areas in this spectral region differ in agreement with the DNA base richness type when using the ARO. A peak assigned to adenine was more evident in the AT-rich salmon DNA using either the ARO or the ATR objective. It is concluded that, for the discrimination of DNA IR hydrogen bond vibrations associated with varying base type proportions, the use of an ARO is recommended.
Abstract:
The aim was to evaluate the relationship between orofacial function, dentofacial morphology, and bite force in young subjects. Three hundred and sixteen subjects were divided according to dentition stage (early, intermediate, and late mixed and permanent dentition). Orofacial function was screened using the Nordic Orofacial Test-Screening (NOT-S). Orthodontic treatment need, bite force, lateral and frontal craniofacial dimensions and presence of sleep bruxism were also assessed. The results were submitted to descriptive statistics, normality and correlation tests, analysis of variance, and multiple linear regression to test the relationship between NOT-S scores and the studied independent variables. The variance of NOT-S scores between groups was not significant. The evaluation of the variables that significantly contributed to NOT-S scores variation showed that age and presence of bruxism related to higher NOT-S total scores, while the increase in overbite measurement and presence of closed lip posture related to lower scores. Bite force did not show a significant relationship with scores of orofacial dysfunction. No significant correlations between craniofacial dimensions and NOT-S scores were observed. Age and sleep bruxism were related to higher NOT-S scores, while the increase in overbite measurement and closed lip posture contributed to lower scores of orofacial dysfunction.