67 results for Public debt - Brazil - Econometric models
Abstract:
In this work we obtained parameters for a 1D regional velocity model for the Borborema Province, NE Brazil. We used earthquakes that occurred between 2001 and 2013 with magnitudes greater than 2.9 mb, with epicentres determined either from local seismic networks or, when possible, by back-azimuth determination. We chose seven events which occurred in the main seismic areas of the Borborema Province. The selected events were recorded by up to 74 seismic stations from the following networks: RSISNE, INCT-ET, João Câmara - RN, São Rafael - RN, Caruaru - PE, São Caetano - PE, Castanhão - CE, Santana do Acarau - CE, Taipu - RN and Sobral - CE, plus the RCBR station (IRIS/USGS—GSN). The model parameters were determined by inverting a travel-time table and its fit. These parameters were compared with other known models (global and regional) and improved the epicentral determinations. The final set of model parameters, which we call MBB, is laterally homogeneous, with an upper crust extending to 11.45 km depth and a total crustal thickness of 33.9 km. The P-wave velocity was estimated at 6.0 km/s in the upper crust and 6.64 km/s in the lower crust. The P-wave velocity in the upper mantle was estimated at 8.21 km/s, with a VP/VS ratio of approximately 1.74.
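A travel-time sketch may help make the layered model concrete. The depths and velocities below are the MBB values quoted in the abstract; the flat-layer geometry, surface source/receiver, and the Pg/Pn first-arrival comparison are simplifying assumptions of ours, not the inversion actually performed in the work:

```python
import math

# MBB model values from the abstract; flat layers and a surface source
# are our simplifying assumptions for illustration.
LAYERS = [(11.45, 6.0), (33.9 - 11.45, 6.64)]  # (thickness km, Vp km/s)
VN = 8.21                                      # upper-mantle Vp (km/s)

def pg_time(x):
    """Direct P wave through the upper crust (surface source and receiver)."""
    return x / LAYERS[0][1]

def pn_time(x):
    """Head wave refracted along the Moho, flat-layer approximation:
    t = x / Vn + sum of 2 * h_i * sqrt(1/v_i^2 - 1/Vn^2) over crustal layers."""
    intercept = sum(2 * h * math.sqrt(1 / v**2 - 1 / VN**2) for h, v in LAYERS)
    return x / VN + intercept

def first_arrival(x):
    """First-arriving P phase at epicentral distance x (km)."""
    return min(pg_time(x), pn_time(x))
```

At short distances the direct Pg arrives first; beyond the crossover distance the mantle-refracted Pn overtakes it, which is what a travel-time table of this kind encodes.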
Abstract:
A significant observational effort has been directed at investigating the nature of the so-called dark energy. In this dissertation we derive constraints on dark energy models using three different observables: measurements of the Hubble rate H(z) (compiled by Meng et al. in 2015); distance moduli of 580 Type Ia supernovae (Union2.1 compilation, 2011); and observations of baryon acoustic oscillations (BAO) and the cosmic microwave background (CMB), through the so-called CMB/BAO ratio for six BAO peaks (one peak determined from 6dFGS survey data, two from the SDSS and three from WiggleZ). The statistical analysis used the minimum-χ2 method (marginalized or minimized over h whenever possible) to constrain the cosmological parameters Ωm, ω and δω0. These tests were applied to two parameterizations of the parameter ω of the dark energy equation of state, p = ωρ (here p is the pressure and ρ is the energy density of the component). In one, ω is taken to be constant and less than -1/3, known as the XCDM model; in the other, the equation-of-state parameter varies with redshift, which we call the GS model. The latter is based on arguments arising from the theory of cosmological inflation. For comparison, the ΛCDM model was also analysed. Comparing cosmological models with different observations leads to different best-fit configurations; thus, to rank the observational viability of the different theoretical models we used two information criteria, the Bayesian information criterion (BIC) and the Akaike information criterion (AIC). The Fisher matrix was incorporated into our analysis to provide the uncertainties of the parameters of each theoretical model. We found that the complementarity of the tests is necessary in order to avoid degenerate parametric spaces. Through minimization we found (at 68% confidence), for the XCDM model, best-fit parameters Ωm = 0.28 ± 0.012 and ωX = −1.01 ± 0.052, while for the GS model the best fit is Ωm = 0.28 ± 0.011 and δω0 = 0.00 ± 0.059. Through marginalization we found (at 68% confidence), for the XCDM model, Ωm = 0.28 ± 0.012 and ωX = −1.01 ± 0.052, while for the GS model, Ωm = 0.28 ± 0.011 and δω0 = 0.00 ± 0.059.
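As an illustration of the minimum-χ2 step for the XCDM parameterization, the sketch below generates mock H(z) points from a fiducial flat XCDM model and recovers (Ωm, ω) with a coarse grid search. The fiducial values, redshifts, error bars and grid are invented for illustration and are not the datasets used in the dissertation:

```python
import math

def hubble(z, H0, om, w):
    """H(z) in a flat XCDM cosmology: matter plus dark energy with constant w."""
    return H0 * math.sqrt(om * (1 + z)**3 + (1 - om) * (1 + z)**(3 * (1 + w)))

# Mock data generated from a fiducial model (illustrative, not the thesis data).
H0_FID, OM_FID, W_FID = 70.0, 0.28, -1.0
zs = [0.1, 0.3, 0.5, 0.9, 1.3, 1.75]
data = [(z, hubble(z, H0_FID, OM_FID, W_FID), 5.0) for z in zs]  # (z, H, sigma)

def chi2(om, w, H0=H0_FID):
    """Sum of squared, error-weighted residuals between model and data."""
    return sum(((hubble(z, H0, om, w) - Hobs) / sig)**2 for z, Hobs, sig in data)

# Coarse grid search for the minimum-chi2 point (a stand-in for the
# minimization/marginalization machinery described in the abstract).
grid = [(om / 100, -1.5 + wi / 100)
        for om in range(10, 51) for wi in range(0, 101)]
best = min(grid, key=lambda p: chi2(*p))
```

Since the mock data lie exactly on the fiducial model, the grid search lands back on (0.28, −1.0); with real, noisy data the χ2 surface around this minimum is what yields the 68% uncertainties quoted above.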
Abstract:
Survival models deal with the modelling of time-to-event data. In certain situations, part of the population may never experience the event of interest; in this context, cure fraction models emerged. Among the models that incorporate a cured fraction, one of the best known is the promotion time model. In the present study we discuss hypothesis testing in the promotion time model with a Weibull distribution for the failure times of susceptible individuals. Hypothesis testing in this model may be performed using the likelihood ratio, gradient, score or Wald statistics. The critical values are obtained from asymptotic approximations, which may result in size distortions in finite samples. This study proposes bootstrap corrections to the aforementioned tests, and a bootstrap Bartlett correction to the likelihood ratio statistic, in the Weibull promotion time model. Using Monte Carlo simulations, we compared the finite-sample performance of the proposed corrections with that of the usual tests. The numerical evidence favours the corrected tests. An empirical application is presented at the end of the work.
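The parametric bootstrap idea behind such corrections can be sketched on a deliberately simpler problem. Below, the likelihood ratio statistic for H0: rate = rate0 in an exponential model is calibrated by resampling under the null instead of relying on the asymptotic χ2 approximation; the Weibull promotion time model of the study would replace this exponential likelihood, and all data here are simulated:

```python
import math
import random

def loglik_exp(rate, times):
    """Log-likelihood of an exponential model with the given rate."""
    return len(times) * math.log(rate) - rate * sum(times)

def lr_stat(times, rate0):
    """Likelihood ratio statistic for H0: rate == rate0."""
    mle = len(times) / sum(times)  # exponential MLE in closed form
    return 2 * (loglik_exp(mle, times) - loglik_exp(rate0, times))

def bootstrap_pvalue(times, rate0, B=999):
    """Parametric bootstrap: simulate under H0, compare LR statistics.
    This is the calibration idea; the asymptotic test would instead
    compare the observed statistic to a chi-squared critical value."""
    observed = lr_stat(times, rate0)
    n = len(times)
    exceed = sum(
        lr_stat([random.expovariate(rate0) for _ in range(n)], rate0) >= observed
        for _ in range(B)
    )
    return (exceed + 1) / (B + 1)
```

When the null is badly violated, the observed statistic exceeds essentially every bootstrap replicate and the p-value collapses to its minimum of 1/(B + 1).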
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Chronic hepatitis C is the leading cause of advanced chronic liver disease, of hepatocellular carcinoma (HCC), and of death related to liver disease. It evolves progressively over 20-30 years, with progression rates varying according to virus, host and behavioural factors. This study evaluated the impact of hepatitis C on the lives of patients treated at a referral service in hepatology of the Onofre Lopes University Hospital (Liver Study Group) from May 1995 to December 2013. A retrospective review of 10,304 records was performed in order to build a cohort of patients with hepatitis C, in which all individuals had their diagnosis confirmed by the gold-standard molecular biology test. Data were obtained directly from patient charts and recorded in a previously built Excel spreadsheet, following a coding scheme covering the study variables, which comprise individual data and prognostic factors for the progression of chronic hepatitis C defined in the literature. The Research Ethics Committee approved the project. The results were analysed statistically with the chi-square test and Fisher's exact test to verify associations between variables; for the multivariate analysis, binomial logistic regression was used. For both tests, significance was set at p < 0.05 with 95% confidence. The results showed that the prevalence of chronic hepatitis C in the NEF was 4.96%. The prevalence of cirrhosis due to hepatitis C was 13.7%; of diabetes in patients with hepatitis C, 8.78%; and of diabetes in cirrhotic patients with hepatitis C, 38.0%. The prevalence of HCC was 5.45%. The rate of discontinuation of clinical follow-up was 67.5%. Mortality was 4.10% in confirmed cases without cirrhosis and 32.1% in cirrhotic patients. The factors associated with the development of cirrhosis were genotype 1 (p = 0.0015) and bilirubin > 1.3 mg% (p = 0.0017).
Factors associated with mortality were age over 35 years, treatment abandonment, diabetes, insulin use, AST > 60 IU, ALT > 60 IU, high total bilirubin, prolonged prothrombin time, high INR, low albumin, treatment withdrawal, cirrhosis and hepatocarcinoma. The occurrence of diabetes mellitus increased the mortality of patients with hepatitis C six-fold. Variables associated with the diagnosis of cirrhosis were being a blood donor (odds ratio 0.24, p = 0.044) and being a professional athlete (odds ratio 0.18, p = 0.35). It is reasonable to consider a re-evaluation of the currently proposed screening models for CHC. The conditions of cirrhosis and diabetes modify the clinical course of patients with chronic hepatitis C, making the disease more deadly. Being a blood donor or a professional athlete, however, is a protective factor that reduces the risk of cirrhosis, independent of alcohol consumption. Public policies providing more efficient access, reception and resolution are needed for this population.
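The protective odds ratios quoted above (0.24 for blood donors, 0.18 for athletes) come from the regression step. A minimal sketch of how an odds ratio and a Wald 95% confidence interval are computed from a 2×2 exposure-by-outcome table follows; the counts are invented placeholders, not the cohort data:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio with a Wald 95% CI for a 2x2 table:
         a = exposed with outcome,   b = exposed without outcome
         c = unexposed with outcome, d = unexposed without outcome"""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) from the cell counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)
```

An odds ratio below 1 with a confidence interval excluding 1 indicates a protective factor, which is how the blood-donor and athlete results are read in the abstract.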
Abstract:
This study examines the factors that influence public managers in the adoption of advanced practices related to information security management. The research used, as the basis of its assertions, the security standard ISO 27001:2005 and a theoretical model based on the TAM (Technology Acceptance Model) of Venkatesh and Davis (2000). The method adopted was a field survey of national scope with the participation of eighty public administrators from Brazilian states, all of them managers and planners in state governments. The approach was quantitative, and descriptive statistics, factor analysis and multiple linear regression were used for data analysis. The survey results showed a correlation between the constructs of the TAM model (ease of use, perceived usefulness, attitude and intention to use) and agreement with the assertions drawn from ISO 27001, showing that these factors influence managers in the adoption of such practices. For the other independent variables of the model (organizational profile, demographic profile and manager behaviour), no significant correlation with the assertions of the standard was identified, which points to the need for further research using these constructs. It is hoped that this study contributes positively to the discussion of information security management, the adoption of security standards and the Technology Acceptance Model.
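The multiple-linear-regression step can be sketched with ordinary least squares solved via the normal equations. The construct names in the comments and the toy data are invented placeholders, not the survey instrument or its responses:

```python
# Hypothetical illustration: regress an "agreement with ISO 27001
# assertions" score on TAM construct scores (e.g. ease of use,
# perceived usefulness). Variable names and data are invented.

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved by Gaussian elimination. Each row of X starts with 1 (intercept)."""
    k = len(X[0])
    xtx = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    # Forward elimination with partial pivoting on the augmented matrix
    A = [row[:] + [t] for row, t in zip(xtx, xty)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    # Back substitution yields the coefficient vector
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (A[r][k] - sum(A[r][j] * beta[j]
                                 for j in range(r + 1, k))) / A[r][r]
    return beta
```

In the study's setting, the sign and significance of each coefficient are what indicate whether a TAM construct predicts agreement with the standard's practices.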