917 results for ARCH and GARCH Models
Abstract:
Coexistence of sympatric species is mediated by resource partitioning. Pumas occur sympatrically with jaguars throughout most of the jaguar's range, but few studies have investigated space partitioning between the two species. Here, camera trapping and occupancy models accounting for imperfect detection were employed in a Bayesian framework to investigate space partitioning between the jaguar and puma in Emas National Park (ENP), central Brazil. Jaguars were estimated to occupy 54.1% and pumas 39.3% of the sample sites. Jaguar occupancy was negatively correlated with distance to water and positively correlated with the amount of dense habitat surrounding the camera trap. Puma occupancy showed only a weak negative correlation with distance to water and with jaguar presence. Both species were present at the same site less often than expected under independent distributions. Jaguars had a significantly higher detection probability at cameras on roads than at off-road locations; for pumas, detection was similar on and off roads. Results indicate that both differences in habitat use and active avoidance shape space partitioning between jaguars and pumas in ENP. Considering its size, the jaguar is likely the competitively dominant of the two species. Given the jaguar's habitat preferences, suitable habitat outside the park is probably sparse. Consequently, the jaguar population is likely largely confined to the park, while the puma population is known to extend into ENP's surroundings. © 2011 Deutsche Gesellschaft für Säugetierkunde. Published by Elsevier GmbH. All rights reserved.
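The core of the occupancy approach described above is a likelihood that separates the probability that a site is occupied (psi) from the probability of detecting the species given occupancy (p). The sketch below is a minimal maximum-likelihood version for one species with constant psi and p, not the Bayesian two-species machinery the study actually used; the detection histories are hypothetical stand-ins for the camera-trap records.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical detection histories: rows = sites, columns = repeat surveys
# (1 = species detected, 0 = not detected).
y = np.array([[1, 0, 1], [0, 0, 0], [1, 1, 0], [0, 0, 0]])

def neg_log_lik(params):
    """Single-season occupancy model with constant psi and p."""
    psi = 1 / (1 + np.exp(-params[0]))   # occupancy probability
    p = 1 / (1 + np.exp(-params[1]))     # per-survey detection probability
    detections = y.sum(axis=1)
    surveys = y.shape[1]
    # Site likelihood: occupied and detected at least once, versus
    # either occupied-but-always-missed or genuinely unoccupied.
    lik = np.where(
        detections > 0,
        psi * p**detections * (1 - p)**(surveys - detections),
        psi * (1 - p)**surveys + (1 - psi),
    )
    return -np.log(lik).sum()

fit = minimize(neg_log_lik, x0=[0.0, 0.0])
psi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
print(f"psi = {psi_hat:.2f}, p = {p_hat:.2f}")
```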
Abstract:
The use of geoid models to estimate the Mean Dynamic Topography (MDT) was stimulated by the launch of the GRACE satellite mission, since the resulting models offer unprecedented precision and space-time resolution. In the present study, besides the DNSC08 mean sea level model, the following geoid models were used with the objective of computing the MDTs: EGM96, EIGEN-5C and EGM2008. In the method adopted, geostrophic currents for the South Atlantic were computed from the MDTs. It was found that the degree and order of the geoid models directly affect the determination of the MDT and the currents. The presence of noise in the MDT requires the use of efficient filtering techniques, such as the filter based on Singular Spectrum Analysis, which offers significant advantages over conventional filters. Geostrophic currents derived from the geoid models were compared with the HYCOM hydrodynamic numerical model. In conclusion, results show that the MDTs and respective geostrophic currents calculated with the EIGEN-5C and EGM2008 models are similar to the results of the numerical model, especially regarding the main large-scale features such as boundary currents and the retroflection at the Brazil-Malvinas Confluence.
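The step from an MDT grid to surface geostrophic currents rests on geostrophic balance, u = -(g/f) ∂η/∂y and v = (g/f) ∂η/∂x with Coriolis parameter f = 2Ω sin(φ). The Python sketch below illustrates the computation on a hypothetical MDT grid; a real analysis would instead difference the DNSC08 mean sea surface against a geoid such as EGM2008, and would filter the noisy MDT as described in the abstract.

```python
import numpy as np

# Hypothetical MDT grid (metres) on a regular 1-degree lat/lon grid.
lat = np.linspace(-60, -10, 51)          # degrees, South Atlantic band
lon = np.linspace(-70, 20, 91)
mdt = 0.5 * np.sin(np.radians(lat))[:, None] * np.cos(np.radians(lon))[None, :]

g = 9.81                                  # gravity, m/s^2
omega = 7.2921e-5                         # Earth's rotation rate, rad/s
R = 6.371e6                               # Earth radius, m
f = 2 * omega * np.sin(np.radians(lat))[:, None]   # Coriolis parameter

# Gradients per degree, then converted to per metre before applying
# geostrophy: u = -(g/f) dMDT/dy, v = (g/f) dMDT/dx.
deta_dlat, deta_dlon = np.gradient(mdt, lat, lon)
dy = np.radians(1.0) * R                              # metres per degree lat
dx = np.radians(1.0) * R * np.cos(np.radians(lat))[:, None]  # per degree lon
u = -(g / f) * deta_dlat / dy
v = (g / f) * deta_dlon / dx
```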
Abstract:
Background: In addition to the oncogenic human papillomavirus (HPV), several cofactors are needed in cervical carcinogenesis, but whether the HPV covariates associated with incident i) CIN1 are different from those of incident ii) CIN2 and iii) CIN3 needs further assessment. Objectives: To gain further insight into the true biological differences between CIN1, CIN2 and CIN3, we assessed HPV covariates associated with incident CIN1, CIN2 and CIN3. Study Design and Methods: HPV covariates associated with progression to CIN1, CIN2 and CIN3 were analysed in the combined cohort of the NIS (n = 3,187) and LAMS study (n = 12,114), using competing-risks regression models (in panel data) for baseline HR-HPV-positive women (n = 1,105), who represent a sub-cohort of all 1,865 women prospectively followed up in these two studies. Results: Altogether, 90 (4.8%), 39 (2.1%) and 14 (1.4%) cases progressed to CIN1, CIN2 and CIN3, respectively. Among these baseline HR-HPV-positive women, the risk profiles of incident CIN1, CIN2 and CIN3 were unique in that completely different HPV covariates were associated with progression to CIN1, CIN2 and CIN3, irrespective of which categories (non-progression, CIN1, CIN2, CIN3 or all) were used as competing-risks events in univariate and multivariate models. Conclusions: These data confirm our previous analysis based on multinomial regression models, indicating that distinct HR-HPV covariates are associated with progression to CIN1, CIN2 and CIN3. This emphasises true biological differences between the three grades of CIN, which revisits the concept of combining CIN2 with CIN3 or with CIN1 in histological classification, or of using them as a common end-point, e.g., in HPV vaccine trials.
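In a competing-risks setting like the one above, progression to one CIN grade removes a woman from the risk set for the other grades, so naive per-grade Kaplan-Meier curves would overestimate each risk. Below is a minimal, library-free Python sketch of the Aalen-Johansen cumulative incidence estimator on invented follow-up data; the study itself fitted competing-risks regression models, which additionally adjust for covariates.

```python
import numpy as np

# Hypothetical follow-up data: time to event (months) and event codes
# 0 = censored / no progression, 1 = CIN1, 2 = CIN2, 3 = CIN3.
times = np.array([6, 12, 12, 18, 24, 24, 30, 36, 36, 48])
events = np.array([1, 0, 2, 0, 3, 1, 0, 2, 0, 0])

def cumulative_incidence(times, events, cause):
    """Aalen-Johansen estimate of the cumulative incidence of one cause,
    treating the other progression grades as competing risks."""
    order = np.argsort(times)
    t, e = times[order], events[order]
    n = len(t)
    surv = 1.0           # overall event-free survival just before each time
    cif = 0.0
    for i, ei in enumerate(e):
        at_risk = n - i
        if ei == cause:                  # contribution S(t-) * d / n_at_risk
            cif += surv / at_risk
        if ei != 0:                      # any event reduces overall survival
            surv *= 1 - 1 / at_risk
    return cif

for grade in (1, 2, 3):
    print(f"CIN{grade}: CIF at end of follow-up = "
          f"{cumulative_incidence(times, events, grade):.2f}")
```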
Abstract:
The topic of my Ph.D. thesis is the finite element modeling of coseismic deformation imaged by DInSAR and GPS data. I developed a method to calculate synthetic Green's functions with finite element models (FEMs) and then use linear inversion methods to determine the slip distribution on the fault plane. The method is applied to the 2009 L'Aquila earthquake (Italy) and to the 2008 Wenchuan earthquake (China). I focus on the influence of the rheological features of the Earth's crust by implementing seismic tomographic data, and on the influence of topography by implementing Digital Elevation Model (DEM) layers in the FEMs. Results for the L'Aquila earthquake highlight the non-negligible influence of the medium structure: homogeneous and heterogeneous models show discrepancies of up to 20% in the fault slip distribution values. Furthermore, in the heterogeneous models a new area of slip appears above the hypocenter. Regarding the 2008 Wenchuan earthquake, the very steep topographic relief of the Longmen Shan Range is implemented in my FE model, and a large number of DEM layers covering East China are used to achieve complete coverage of the model. My objective was to explore the influence of topography on the retrieved coseismic slip distribution. The inversion results reveal significant differences between the flat and the topographic model. Thus, the flat models frequently adopted are inadequate to represent the Earth's surface topography, especially in the case of the 2008 Wenchuan earthquake.
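Once the FEM-derived Green's functions are assembled into a matrix G (one column per fault patch, one row per DInSAR/GPS observation), retrieving the slip vector m from the data d is a linear least-squares problem, typically damped to keep the solution stable. The Python sketch below uses random stand-in values for G and d and a simple Tikhonov damping term; real inversions usually add smoothing and positivity constraints and choose the damping weight from a trade-off curve.

```python
import numpy as np

# Stand-in problem: G holds the FEM-derived Green's functions, d the
# observed displacements. Values here are random placeholders.
rng = np.random.default_rng(0)
n_obs, n_patches = 200, 40
G = rng.normal(size=(n_obs, n_patches))
d = rng.normal(size=n_obs)

# Damped least squares: minimise ||G m - d||^2 + lam^2 ||m||^2 by stacking
# the regularisation equations under the data equations.
lam = 0.5
A = np.vstack([G, lam * np.eye(n_patches)])
b = np.concatenate([d, np.zeros(n_patches)])
slip, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated slip on first five patches:", slip[:5])
```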
Abstract:
Analyzing and modeling relationships between the structure of chemical compounds, their physico-chemical properties, and biological or toxic effects in chemical datasets is a challenging task for scientific researchers in the field of cheminformatics. (Q)SAR model validation is therefore essential to ensure future model predictivity on unseen compounds. Proper validation is also one of the requirements of regulatory authorities for approving a model's use in real-world scenarios as an alternative testing method. At the same time, however, the question of how to validate a (Q)SAR model is still under discussion. In this work, we empirically compare k-fold cross-validation with external test set validation. The introduced workflow makes it possible to apply the built and validated models to large amounts of unseen data and to compare the performance of the different validation approaches. Our experimental results indicate that cross-validation produces (Q)SAR models with higher predictivity than external test set validation and reduces the variance of the results. Statistical validation is important to evaluate the performance of (Q)SAR models, but does not support the user in better understanding the properties of the model or the underlying correlations. We present the 3D molecular viewer CheS-Mapper (Chemical Space Mapper), which arranges compounds in 3D space such that their spatial proximity reflects their similarity. The user can indirectly determine similarity by selecting which features to employ in the process. The tool can use and calculate different kinds of features, such as structural fragments as well as quantitative chemical descriptors. Comprehensive functionalities, including clustering, alignment of compounds according to their 3D structure, and feature highlighting, help the chemist to better understand patterns and regularities and relate the observations to established scientific knowledge. Even though visualization tools for analyzing (Q)SAR information in small-molecule datasets exist, integrated visualization methods that allow for the investigation of model validation results are still lacking. We propose visual validation as an approach for the graphical inspection of (Q)SAR model validation results. New functionalities in CheS-Mapper 2.0 facilitate the analysis of (Q)SAR information and allow the visual validation of (Q)SAR models. The tool enables the comparison of model predictions to the actual activity in feature space. Our approach reveals whether the endpoint is modeled too specifically or too generically and highlights common properties of misclassified compounds. Moreover, the researcher can use CheS-Mapper to inspect how the (Q)SAR model predicts activity cliffs. The CheS-Mapper software is freely available at http://ches-mapper.org.
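The two validation schemes compared above differ in how the available compounds are reused: k-fold cross-validation tests every compound exactly once across k model fits, while external test set validation holds out a single split. A minimal scikit-learn sketch of both follows; the dataset, model, and split sizes are arbitrary stand-ins, not those used in the work.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, train_test_split

# Stand-in data; a real (Q)SAR workflow would compute chemical descriptors
# for each compound and predict the endpoint from them.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
model = RandomForestClassifier(random_state=0)

# k-fold cross-validation: every compound serves as test data exactly once.
cv_scores = cross_val_score(model, X, y, cv=10)
print("10-fold CV accuracy:", cv_scores.mean())

# External test set validation: one single held-out split.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=0)
print("External test accuracy:",
      model.fit(X_train, y_train).score(X_test, y_test))
```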
Abstract:
The Scilla rock avalanche occurred on 6 February 1783 along the coast of the Calabria region (southern Italy), close to the Messina Strait. It was triggered by a mainshock of the Terremoto delle Calabrie seismic sequence, and it induced a tsunami wave responsible for more than 1,500 casualties along the neighboring Marina Grande beach. The main goal of this work is the application of semi-analytical and numerical models to simulate this event. The first is a MATLAB code expressly created for this work that solves the equations of motion for particles sliding on a two-dimensional surface with a fourth-order Runge-Kutta method. The second is a code developed by the Tsunami Research Team of the Department of Physics and Astronomy (DIFA) of the University of Bologna that describes a slide as a chain of blocks able to interact while sliding down a slope, adopting a Lagrangian point of view. A broad description of landslide phenomena, and in particular of landslides induced by earthquakes and with tsunamigenic potential, is given in the first part of the work. Subsequently, the physical and mathematical background is presented; in particular, a detailed study of derivative discretization is provided. Later on, the dynamics of a point mass sliding on a surface is described, together with several applications of numerical and analytical models to ideal topographies. In the last part, the dynamics of points sliding on a surface and interacting with each other is treated, again with different applications to an ideal topography. Finally, the applications to the 1783 Scilla event are shown and discussed.
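The classical fourth-order Runge-Kutta scheme mentioned above advances the state by combining four slope evaluations per step. The Python sketch below integrates the simplest relevant case, a point mass sliding down a uniform slope against Coulomb friction; the slope angle and friction coefficient are illustrative values, not parameters of the Scilla reconstruction.

```python
import numpy as np

# Point mass on a plane of slope angle theta with Coulomb friction mu.
g, theta, mu = 9.81, np.radians(30), 0.3

def rhs(state):
    """state = (position along slope, velocity); returns its time derivative.
    Friction opposes downslope motion; valid while the mass slides forward."""
    x, v = state
    a = g * (np.sin(theta) - mu * np.cos(theta))
    return np.array([v, a])

def rk4_step(state, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = rhs(state)
    k2 = rhs(state + 0.5 * dt * k1)
    k3 = rhs(state + 0.5 * dt * k2)
    k4 = rhs(state + dt * k3)
    return state + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

state, dt = np.array([0.0, 0.0]), 0.01
for _ in range(1000):                 # integrate 10 s of motion
    state = rk4_step(state, dt)
print(f"after 10 s: x = {state[0]:.1f} m, v = {state[1]:.1f} m/s")
```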
Abstract:
Vascular surgeons perform numerous highly sophisticated and delicate procedures. Due to restrictions in training time and the advent of endovascular techniques, new concepts, including alternative environments for training and assessment of surgical skills, are required. Over the past decade, training on simulators and synthetic models has become more sophisticated and lifelike. This study was designed to evaluate the impact of a 3-day intensive training course in open vascular surgery on both specific and global vascular surgical skills.
Abstract:
Most criticism of homeopathy concerns the lack of a scientific basis and of theoretical models. In order to be accepted as a valid part of medical practice, a well-structured research strategy for homeopathy is needed. This is often hampered by methodological problems as well as by gross underinvestment in the required academic resources. Fundamental research could make important contributions to our understanding of the mechanisms of action of homeopathy and high dilutions. Since the pioneering works of Kolisko on wheat germination (Kolisko, 1923) and Junker on the growth of microorganisms (paramecium, yeast, fungi) (Junker, 1928), a number of experiments have been performed either with healthy organisms (various physiological aspects of growth) or with artificially diseased organisms, which may react more markedly to homeopathic treatments than healthy ones. In the latter case, the preliminary stress may be either abiotic, e.g. heavy metals, or biotic, e.g. fungal and viral pathogens or nematode infection. Research has also been carried out into the applicability of homeopathic principles to crop growth and disease control (agrohomeopathy): because of the extreme dilutions used, the environmental impact is low and such treatments are well suited to the holistic approach of sustainable agriculture (Betti et al., 2006). Unfortunately, as Scofield reported in an extensive critical review (Scofield, 1984), there is little firm evidence to support the reliability of the reported results, due to poor experimental methodology and inadequate statistical analysis. Moreover, since there is no agricultural homeopathic pharmacopoeia, much work is required to find suitable remedies, potencies and dose levels.
Abstract:
INTRODUCTION: This paper focuses exclusively on experimental models with ultra-high dilutions (i.e. beyond 10^-23) that have been submitted to replication scrutiny. It updates previous surveys, considers suggestions made by the research community, and compares the state of replication in 1994 with that in 2015. METHODS: Following a literature search, biochemical, immunological, botanical, cell-biological and zoological studies on ultra-high dilutions (potencies) were included. Reports were grouped into initial studies, laboratory-internal, multicentre and external replications. Repetition could yield comparable, zero, or opposite results. The null hypothesis was that test and control groups would not be distinguishable (zero effect). RESULTS: A total of 126 studies were found, of which 28 were initial studies. When all 98 replicative studies were considered, 70.4% (i.e. 69) reported a result comparable to that of the initial study, 20.4% (20) a zero effect, and 9.2% (9) an opposite result. Both for the studies up to 1994 and for the studies from 1995-2015, the null hypothesis (dominance of zero results) should be rejected. Furthermore, the odds of finding a comparable result are generally higher than those of finding an opposite result. Although this holds for all three types of replication study, the fraction of comparable studies diminishes from laboratory-internal (82.9%) to multicentre (75%) to external (48.3%), while the fraction of opposite results rises from 4.9% to 10.7% to 13.8%. It also became apparent that the probability of an external replication producing comparable results is higher for models that had already been further scrutinized by the initial researchers. CONCLUSIONS: We found 28 experimental models which underwent replication. In total, 24 models were replicated with comparable results, 12 models with a zero effect, and 6 models with opposite results. Five models were externally reproduced with comparable results. We encourage further replications of studies in order to learn more about the model systems used.
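The abstract does not spell out the statistical procedure behind rejecting the null hypothesis, but the reported counts allow a simple illustration: if zero results dominated, the probability of a zero outcome would be at least one half. A one-sided binomial test on the reported counts, sketched in Python (this is an assumed reading of the test, not the paper's documented method):

```python
from scipy.stats import binomtest

# Outcome counts reported in the abstract for the 98 replicative studies.
comparable, zero, opposite = 69, 20, 9
n = comparable + zero + opposite

# H0: zero-effect results dominate, i.e. P(zero result) >= 0.5;
# a small p-value supports rejecting that dominance.
res = binomtest(zero, n, p=0.5, alternative="less")
print(f"P(zero) = {zero / n:.2f}, p-value = {res.pvalue:.2e}")
```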
Abstract:
Despite the strong increase in observational data on extrasolar planets, the processes that led to the formation of these planets are still not well understood. However, thanks to the high number of extrasolar planets that have been discovered, it is now possible to look at the planets as a population that puts statistical constraints on theoretical formation models. A method that uses these constraints is planetary population synthesis, in which synthetic planetary populations are generated and compared to the actual population. The key element of the population synthesis method is a global model of planet formation and evolution. These models directly predict observable planetary properties based on properties of the natal protoplanetary disc, linking two important classes of astrophysical objects. To do so, global models build on the simplified results of many specialized models that each address one specific physical mechanism. We thoroughly review the physics of the sub-models included in global formation models. The sub-models can be classified as those describing the protoplanetary disc (of gas and solids), those describing one (proto)planet (its solid core, gaseous envelope and atmosphere), and those describing the interactions (orbital migration and N-body interaction). We compare the approaches taken in different global models, discuss the links between specialized and global models, and identify physical processes that require improved descriptions in future work. We then briefly address important results of planetary population synthesis, such as the planetary mass function and the mass-radius relationship. With these statistical results, the global effects of physical mechanisms occurring during planet formation and evolution become apparent, and the specialized models describing them can be put to the observational test. Owing to their nature as meta-models, global models depend on the results of specialized models, and therefore on the development of the field of planet formation theory as a whole. Because there are important uncertainties in this theory, it is likely that global models will undergo significant modifications in the future. Despite these limitations, global models can already yield many testable predictions. With future global models addressing the geophysical characteristics of the synthetic planets, it should eventually become possible to make predictions about the habitability of planets based on their formation and evolution.
Abstract:
This paper empirically assesses whether monetary policy affects real economic activity through its effect on the aggregate supply side of the macroeconomy. Analysts typically argue either that monetary policy does not affect the real economy (the classical dichotomy), or that it affects the real economy only in the short run through aggregate demand (new Keynesian or new classical theories). Real business cycle theorists try to explain the business cycle with supply-side productivity shocks. We provide some preliminary evidence on how monetary policy affects the aggregate supply side of the macroeconomy through its effect on total factor productivity, an important measure of supply-side performance. The results show that monetary policy exerts a positive and statistically significant effect on the supply side of the macroeconomy. Moreover, the findings buttress the importance of countercyclical monetary policy and support the adoption of an optimal money supply rule. Our results are also consistent with an effective role for monetary policy in the Great Moderation as well as in the more recent rise in productivity growth.
Abstract:
This paper explores the dynamic linkages that portray different facets of the joint probability distribution of stock market returns in NAFTA (i.e., Canada, Mexico, and the US). Our examination of the interactions of the NAFTA stock markets considers three issues. First, we examine the long-run relationship between the three markets, using cointegration techniques. Second, we evaluate the dynamic relationships between the three markets, using impulse-response analysis. Finally, we explore the volatility transmission process between the three markets, using a variety of multivariate GARCH models. Our results exhibit significant, albeit not homogeneous, volatility transmission between the second moments of the NAFTA stock markets. The magnitude and trend of the conditional correlations indicate that in the last few years the Mexican stock market has exhibited a tendency toward increased integration with the US market. Finally, we note evidence that the Peso and Asian financial crises, as well as the US stock-market crash, affected the return and volatility time-series relationships.
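As a concrete starting point for the volatility modelling discussed above: in a GARCH(1,1), the conditional variance responds to past shocks and persists over time, σ²ₜ = ω + α ε²ₜ₋₁ + β σ²ₜ₋₁. The Python sketch below fits a univariate GARCH(1,1) with the `arch` package to simulated returns; the paper itself uses multivariate GARCH specifications to capture cross-market transmission, which this package does not provide.

```python
import numpy as np
from arch import arch_model

# Simulated heavy-tailed daily returns as a stand-in for the NAFTA stock
# index returns analysed in the paper.
rng = np.random.default_rng(0)
returns = rng.standard_t(df=8, size=2000) * 0.8

# Constant-mean GARCH(1,1): sigma2_t = omega + alpha*eps2_{t-1} + beta*sigma2_{t-1}
am = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1)
res = am.fit(disp="off")
print(res.summary())
```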