903 results for "Tests for Continuous Lifetime Data"
Abstract:
This paper tests the optimality of consumption decisions at the aggregate level, taking into account popular deviations from the canonical constant-relative-risk-aversion (CRRA) utility model: rule-of-thumb behavior and habit. First, based on the critique in Carroll (2001) and Weber (2002) of linearization and testing strategies using Euler equations for consumption, we provide extensive empirical evidence of their inappropriateness, a drawback for standard rule-of-thumb tests. Second, we propose a novel approach to test for consumption optimality in this context: nonlinear estimation coupled with return aggregation, where rule-of-thumb behavior and habit are special cases of an all-encompassing model. We estimated 48 Euler equations using GMM. At the 5% level, we rejected optimality only twice out of 48 times. Moreover, out of 24 regressions, we found the rule-of-thumb parameter to be statistically significant only twice. Hence, lack of optimality in consumption decisions represents the exception, not the rule. Finally, we found the habit parameter to be statistically significant on four occasions out of 24.
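The nonlinear GMM strategy the abstract describes can be illustrated schematically. The following is a minimal sketch, using synthetic data (not the paper's series) and an identity weighting matrix, of estimating the discount factor and risk-aversion parameter from a CRRA Euler-equation moment condition with lagged instruments:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
T = 400
# Synthetic data, purely illustrative: consumption growth and gross returns.
cgrowth = np.exp(rng.normal(0.005, 0.02, T))     # c_{t+1} / c_t
returns = 1.02 + 0.04 * rng.standard_normal(T)   # gross return R_{t+1}

# Instruments: constant plus lagged consumption growth and lagged return.
Z = np.column_stack([np.ones(T), np.roll(cgrowth, 1), np.roll(returns, 1)])[1:]
cg, R = cgrowth[1:], returns[1:]

def gmm_objective(theta):
    beta, gamma = theta
    # Euler-equation residual: beta * (c_{t+1}/c_t)^(-gamma) * R_{t+1} - 1
    u = beta * cg ** (-gamma) * R - 1.0
    g = Z.T @ u / len(u)   # sample moment conditions
    return g @ g           # GMM objective, identity weighting matrix

res = minimize(gmm_objective, x0=[0.98, 2.0], method="Nelder-Mead")
beta_hat, gamma_hat = res.x
```

A second GMM step would re-weight the moments with the inverse of their estimated covariance; the sketch stops at the first step for brevity.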
Abstract:
Theories can be produced by individuals seeking a good reputation for knowledge. Hence, a significant question is how to test theories, anticipating that they might have been produced by (potentially uninformed) experts who prefer their theories not to be rejected. If a theory that predicts exactly like the data-generating process is not rejected with high probability, the test is said to not reject the truth. On the other hand, if a false expert with no knowledge of the data-generating process can strategically select theories that will not be rejected, then the test can be ignorantly passed. Such tests have limited use because they cannot feasibly dismiss completely uninformed experts. Many tests proposed in the literature (e.g., calibration tests) can be ignorantly passed. Dekel and Feinberg (2006) introduced a class of tests that seemingly have some power to dismiss uninformed experts. We show that some tests from their class can also be ignorantly passed. One of those tests, however, does not reject the truth and cannot be ignorantly passed. Thus, this empirical test can dismiss false experts. We also show that a false reputation of knowledge can be strategically sustained for an arbitrary, but given, number of periods, no matter which test is used (provided that it does not reject the truth). However, false experts can be discredited, even with bounded data sets, if the domain of permissible theories is mildly restricted.
Abstract:
This paper provides new evidence on the determinants of the allocation of the US federal budget to the states and tests the ability of congressional, electoral, and partisan theories to explain that allocation. We find that socio-economic characteristics are important explanatory variables but are not sufficient to explain the disparities in the distribution of federal monies. First, prestige-committee membership is not conducive to pork-barrelling. We do not find any evidence that marginal states receive more funding; on the contrary, safe states tend to be rewarded. Also, states that are historically "swing" in presidential elections tend to receive more funds. Finally, we find strong evidence supporting partisan theories of budget allocation. States whose governor has the same political affiliation as the President receive more federal funds, while states whose representatives belong to a majority opposing the President's party receive fewer funds.
Abstract:
Using the theoretical framework of Lettau and Ludvigson (2001), we investigate empirically how widespread the predictability of cay (a modified consumption-wealth ratio) is once we consider a set of important countries from a global perspective. We chose to work with the G7 countries, which represent more than 64% of net global wealth and 46% of global GDP at market exchange rates. We evaluate the forecasting performance of cay using a panel-data approach, since applying cointegration and other time-series techniques is now standard practice in the panel-data literature. Hence, we generalize Lettau and Ludvigson's tests to a panel of important countries. We employ macroeconomic and financial quarterly data for the G7 countries, forming an unbalanced panel. For most countries, data are available from the early 1990s until 2014Q1; for the U.S. economy, they are available from 1981Q1 through 2014Q1. The results of an exhaustive empirical investigation are overwhelmingly in favor of the predictive power of cay in forecasting future stock returns and excess returns.
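The core of such a predictability test is a regression of next-period returns on the lagged predictor. The sketch below, using synthetic single-country data rather than the G7 panel, shows the basic predictive regression r_{t+1} = a + b*cay_t + e_{t+1} estimated by OLS; a panel version would stack countries and add fixed effects:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 2000
# Synthetic stand-ins for cay_t and excess returns (illustrative only):
# returns are generated to load on lagged cay with coefficient 0.5.
cay = rng.standard_normal(T)
ret = np.empty(T)
ret[0] = 0.0
ret[1:] = 0.5 * cay[:-1] + 2.0 * rng.standard_normal(T - 1)

# Predictive regression: r_{t+1} = a + b * cay_t + e_{t+1}
X = np.column_stack([np.ones(T - 1), cay[:-1]])
y = ret[1:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
a_hat, b_hat = coef
resid = y - X @ coef
r2 = 1.0 - resid.var() / y.var()   # in-sample R^2 of the forecast
```

In practice, inference on b would use heteroskedasticity- and autocorrelation-consistent standard errors, since overlapping returns induce serial correlation.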
Abstract:
Online geographic databases have grown increasingly important as they have become a crucial source of information for both social networks and safety-critical systems. Since the quality of such applications is largely related to the richness and completeness of their data, it becomes imperative to develop adaptable and persistent storage systems, able to draw on several sources of information while responding as fast as possible. This work creates a shared and extensible geographic model, able to retrieve and store information from the major spatial sources available. A geographic-based system also has very high requirements in terms of scalability, computational power, and domain complexity, causing several difficulties for a traditional relational database as the number of results increases. NoSQL systems provide valuable advantages in this scenario, in particular graph databases, which are capable of modeling vast amounts of interconnected data while providing a very substantial performance increase for several spatial requests, such as finding shortest-path routes and performing relationship lookups with high concurrency. In this work, we analyze the current state of geographic information systems and develop a unified geographic model, named GeoPlace Explorer (GE). GE is able to import and store spatial data from several online sources at a symbolic level in both a relational and a graph database, on which several stress tests were performed in order to identify the advantages and disadvantages of each database paradigm.
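The shortest-path queries mentioned above are exactly the kind of traversal at which graph stores excel. As a self-contained illustration (independent of GeoPlace Explorer itself), the sketch below runs Dijkstra's algorithm over a toy road network represented as an adjacency dictionary, the same structure a graph database traverses natively:

```python
import heapq

def dijkstra(graph, source, target):
    """Shortest weighted path via Dijkstra's algorithm over an adjacency dict."""
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    visited = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        if u == target:
            break
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    # Walk predecessors back from the target to reconstruct the route.
    path, node = [], target
    while node != source:
        path.append(node)
        node = prev[node]
    path.append(source)
    return dist[target], path[::-1]

# Hypothetical road network: node -> [(neighbour, distance_km), ...]
roads = {
    "A": [("B", 2.0), ("C", 5.0)],
    "B": [("C", 1.0), ("D", 4.0)],
    "C": [("D", 1.0)],
}
cost, route = dijkstra(roads, "A", "D")  # -> (4.0, ["A", "B", "C", "D"])
```

A relational store must join an edge table against itself once per hop for the same query, which is why performance diverges as path length and concurrency grow.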
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
The treatment of wastewaters contaminated with oil is of great practical interest and is fundamental in environmental issues. A relevant piece of equipment, which has been studied for the continuous treatment of oil-contaminated water, is the MDIF® (a mixer-settler based on phase inversion). An important variable during the operation of the MDIF® is the water-solvent interface level in the separation section. Controlling this level is essential both to avoid dragging solvent during water removal and to improve the extraction efficiency of the oil by the solvent. The in-line measurement of the oil-water interface level is still a hard task: few sensors can measure it reliably, and for lab-scale systems there are no interface sensors of compatible dimensions. The objective of this work was to implement a control system for the organic solvent/water interface level in the MDIF®. Detection of the interface level is based on the acquisition and treatment of images obtained dynamically through a standard camera (webcam). The control strategy was developed to operate in feedback mode: the level measurement obtained by image detection is compared with the desired level, and an action is taken on a control valve according to an implemented PID law. A control and data-acquisition program was developed in Fortran to accomplish the following tasks: image acquisition; water-solvent interface identification; decision making and sending of control signals; and recording of data in files. Experimental open-loop runs were carried out on the MDIF®, applying random pulse disturbances to the input variable (water outlet flow). The interface-level responses permitted process identification by transfer-function models, from which the parameters of a PID controller were tuned by direct synthesis, and closed-loop tests were performed. Preliminary results for the feedback loop demonstrated that the sensor and the control strategy developed in this work are suitable for controlling the organic solvent-water interface level.
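The feedback loop described above (compare measured level with the setpoint, act on a valve through a PID law) can be sketched compactly. The code below is a minimal, hypothetical illustration in Python rather than the authors' Fortran program: a positional-form discrete PID driving an assumed first-order level process, with invented gains and process parameters.

```python
class PID:
    """Discrete PID controller in positional form (a minimal sketch)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt               # accumulate integral term
        deriv = (err - self.prev_err) / self.dt      # finite-difference derivative
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Hypothetical first-order level process: d(level)/dt = (u - level) / tau,
# simulated with Euler steps; in the real rig u would be the valve command
# and the measurement would come from webcam image processing.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.1)
level, setpoint, tau = 0.0, 1.0, 1.0
for _ in range(500):
    u = pid.update(setpoint, level)
    level += 0.1 * (u - level) / tau   # dt = 0.1 s Euler integration
```

With these (arbitrary) gains the simulated level settles at the setpoint; in the paper the gains were instead obtained by direct synthesis from identified transfer-function models.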
Abstract:
A tungsten carbide coating on the integrated platform of a transversely heated graphite atomizer was used as a modifier for the direct determination of Se in soil extracts by graphite furnace atomic absorption spectrometry. Diethylenetriaminepentaacetic acid (0.0050 mol L-1) plus ammonium hydrogen carbonate (1.0 mol L-1) extracted predominantly available inorganic selenate from soil. The formation of a large amount of carbonaceous residue inside the atomizer was avoided with a first pyrolysis step at 600 °C, assisted by air for 30 s. For 20 µL of soil extract delivered to the atomizer and calibration by matrix matching, an analytical curve (10.0-100 µg L-1) with good linear correlation (r = 0.999) between integrated absorbance and analyte concentration was established. The characteristic mass was approximately 63 pg of Se, and the lifetime of the tube was approximately 750 firings. The limit of detection was 1.6 µg L-1, and the relative standard deviations (n = 12) were typically <4% for a soil extract containing 50 µg L-1. The accuracy of the Se determination was checked for soil samples by means of addition/recovery tests. Recoveries of Se added to four enriched soil samples varied from 80 to 90%, indicating an accurate method.
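The calibration-curve figures of merit quoted above (linear correlation, limit of detection) follow from a straight-line fit of absorbance against concentration. The sketch below uses invented calibration points, not the paper's measurements, and the common 3·s(blank)/slope convention for the detection limit:

```python
import numpy as np

# Hypothetical calibration points: concentration (ug/L) vs integrated absorbance.
conc = np.array([10.0, 25.0, 50.0, 75.0, 100.0])
absA = np.array([0.021, 0.052, 0.105, 0.157, 0.210])

# Least-squares straight line and Pearson correlation coefficient.
slope, intercept = np.polyfit(conc, absA, 1)
r = np.corrcoef(conc, absA)[0, 1]

# Limit of detection as 3 * (standard deviation of blank) / slope;
# the blank SD here is an assumed value for the sketch.
sd_blank = 0.0011
lod = 3 * sd_blank / slope
```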
Abstract:
Neural networks and the wavelet transform have recently been seen as attractive tools for developing efficient solutions to many real-world function approximation problems. Function approximation is a very important task in environments where computation has to be based on extracting information from data samples of real-world processes, so mathematical modeling is a very important tool to support the development of the neural network area. In this article we introduce a series of mathematical proofs that guarantee the wavelet properties of the PPS functions. As an application, we show the use of PPS-wavelets in handwritten digit recognition problems through function approximation techniques.
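To make the function-approximation setting concrete: a target function is expanded over translated and dilated copies of a mother wavelet, and the coefficients are fitted by least squares. The sketch below uses the standard Mexican-hat wavelet as a stand-in (the article's PPS-wavelets are a different family), with invented centers and scale:

```python
import numpy as np

def mexican_hat(x, center, scale):
    """Mexican-hat (Ricker) mother wavelet, translated and dilated."""
    u = (x - center) / scale
    return (1.0 - u ** 2) * np.exp(-u ** 2 / 2.0)

x = np.linspace(-1.0, 1.0, 200)
target = np.sin(3.0 * x)   # example function to approximate

# Dictionary of 15 translated wavelets at a fixed scale, fitted by least squares.
centers = np.linspace(-1.0, 1.0, 15)
Phi = np.column_stack([mexican_hat(x, c, 0.2) for c in centers])
coef, *_ = np.linalg.lstsq(Phi, target, rcond=None)
approx = Phi @ coef
mse = np.mean((approx - target) ** 2)   # mean squared approximation error
```

A wavelet-network approach would additionally adapt the centers and scales by gradient descent rather than fixing them in advance.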
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Many factors, such as sunlight, radiation intensity, temperature, and moisture, may influence the degradation process of geosynthetics. UV stabilizers are used, especially in polyolefin geomembranes, to prevent the degradation process. In these geomembranes the service lifetime is initially governed by the consumption of antioxidants. Tests like MFI and OIT are an alternative for detecting oxidative degradation in polyolefins. This article evaluates HDPE geomembrane degradation after UV exposure through the results of MFI and OIT tests. Two kinds of geomembranes were evaluated: a black, smooth one (0.8, 1.0, 1.5, and 2.5 mm) and a white, textured one (1.0 mm). The MFI test showed some levels of superficial degradation (crosslinking) in the HDPE geomembrane.
Abstract:
Hydraulic fracturing is an operation in which pressurised fluid is injected into the geological formation surrounding the producing well to create new permeable paths for hydrocarbons. The injection of such fluids into the reservoir induces seismic events, and the resulting reservoir stimulation can be measured by locating these induced microseismic events. However, microseismic monitoring is an expensive operation because the acquisition and data-interpretation systems used in this monitoring rely on high signal-to-noise ratios (SNR): in general, the sensors must be deployed in a monitoring well near the treated well, which makes microseismic monitoring quite costly. In this dissertation we propose the application of a new method for recording and locating microseismic events called nanoseismic monitoring (Joswig, 2006). In this method, a continuous recording is performed and the interpreter can separate events from noise using sonograms. It also allows the location of seismic sources even when the P and S phase onsets are not clear, as in situations of 0 dB SNR. This clear technical advantage is also economically advantageous, since the sensors can potentially be installed on the surface rather than in an observation well. In this dissertation, field tests with controlled sources were carried out. In the first test, small explosives using fireworks at 28 m (slant distance) were detected, yielding magnitudes between -2.4 ≤ ML ≤ -1.6. In a second test, we monitored perforation shots in a producing oil field; one perforation shot was located at a slant distance of 861 m with magnitude 2.4 ML. The data from these tests allow us to say that the method has the potential to be used in the oil industry to monitor hydraulic fracturing.
Biocompatibility in vitro tests of mineral trioxide aggregate and regular and white Portland cements
Abstract:
Mineral trioxide aggregate (MTA) and Portland cement are being used in dentistry as root-end filling materials. However, biocompatibility data concerning genotoxicity and cytotoxicity are needed for a complete risk assessment of these compounds. In the present study, the genotoxic and cytotoxic effects of MTA and Portland cements were evaluated in vitro on mouse lymphoma cells, using the alkaline single-cell gel (comet) assay and the trypan blue exclusion test, respectively. The results demonstrated that the single-cell gel (comet) assay failed to detect DNA damage after treatment of cells with MTA or Portland cements at concentrations up to 1000 µg/ml. Similarly, none of the compounds tested were cytotoxic. Taken together, these results indicate that MTA and Portland cements are not genotoxic and do not induce cell death.
Abstract:
Objectives: This study aimed to determine the dental caries prevalence in Baixo Guandu, the first Brazilian city to fluoridate its public water supplies; to compare the findings with data from the national survey; and to compare the prevalence in the 12-year-old age group with data obtained before fluoridation began. Methods: All lifetime residents aged 5, 12, 15 to 19, and 35 to 44 years were clinically examined following World Health Organization criteria. Results: The dmft/DMFT means were lower than in the Brazilian population living in fluoridated communities. The DMFT index in 12-year-old residents decreased from 8.61 in 1953 to 1.55 in 2005. Conclusions: The addition of fluoride to public water supplies was an important ally in the improvement of the oral health of the inhabitants of Baixo Guandu.