903 results for Geo-statistical model


Relevance: 30.00%

Publisher:

Abstract:

Cognitive radio has been proposed as a means of improving spectrum utilisation and increasing the spectrum efficiency of wireless systems. This can be achieved by allowing cognitive radio terminals to monitor their spectral environment and opportunistically access unoccupied frequency channels. Due to the opportunistic nature of cognitive radio, the overall performance of such networks depends on the spectrum occupancy or availability patterns. Appropriate knowledge of channel availability can optimise sensing performance in terms of spectrum and energy efficiency. This work proposes a statistical framework for channel availability in the polarization domain. A Gaussian (normal) approximation is used to model real-world occupancy data obtained through a measurement campaign in the cellular frequency bands within a realistic scenario.
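A minimal sketch of the kind of normal-approximation modelling described: fit a Gaussian to per-channel duty-cycle measurements and read off the probability that occupancy stays below a threshold. The duty-cycle numbers are invented stand-ins for the campaign data.

```python
import math
import statistics

# Hypothetical duty-cycle measurements (fraction of time each channel was
# observed occupied), standing in for the measurement-campaign data.
duty_cycles = [0.12, 0.18, 0.25, 0.22, 0.15, 0.30, 0.20, 0.17, 0.27, 0.14]

mu = statistics.mean(duty_cycles)
sigma = statistics.stdev(duty_cycles)

def p_occupancy_below(threshold, mu, sigma):
    """P(occupancy < threshold) under the fitted normal model."""
    z = (threshold - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Probability a channel's occupancy stays below 25% (i.e. it is likely free).
print(round(p_occupancy_below(0.25, mu, sigma), 3))
```

A secondary-user scheduler could rank channels by this probability before sensing them, which is the spectrum- and energy-efficiency gain the abstract refers to.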

Relevance: 30.00%

Publisher:

Abstract:

This dissertation introduces several methodological approaches which integrate a proposed coastal management model in an interdisciplinary perspective. The research presented herein is displayed as a set of publications comprising different thematic outlooks. The thesis develops an integrated coastal geoengineering approach which is intrinsically linked to the studied maritime environments. From sandy coasts and marine works to rocky platforms and sea cliffs, this study includes field work between Caminha – Figueira da Foz (NW Portugal) and Galicia (NW Spain). The research also involves the analysis and geological-geotechnical characterisation of natural rock (armourstone) and artificial units (concrete blocks) applied to coastal structures. The main goal is to contribute to the characterisation and re-evaluation of georesources and to determine armourstone suitability and availability from its source (quarry). It was also important to diagnose the geomaterials in situ with respect to their degradation/deterioration level, on the basis of the current status of the coastal protection works, in order to facilitate more efficient monitoring and maintenance, with economic benefits. In the rocky coast approach, the coastal blocks were studied along the platform, and the geoforms were also examined from a coastal morphodynamics point of view. For sandy coasts, a shoreline evolution analysis was developed using the Digital Shoreline Analysis System (DSAS) extension. In addition, the spatial and statistical analysis applied to sea cliffs allowed the establishment of erosion susceptibility zones and hazardous areas. All of these studies have different purposes and results; however, there is a common denominator – GIS mapping. Hence, apart from the studied coastal environment, there is an integrated system which includes a sequence of procedures and methodologies that persisted throughout the research period.
This is a step forward in the study of different coastal environments using largely the same methodologies. It allows the characterisation, monitoring and assessment of coastal protection works, rocky coasts, and shore platforms. With such data, it is possible to propose or recommend strategies for coastal and shoreline management justified in social, economic, and environmental terms, or even to provide a GIS-based planning support system reinforced by geocartographic decisions. Overall, the development of the applied cartography embraces six stages which allow the production of detailed maps of the maritime environment: (1) high-resolution aerial imagery surveys; (2) visual inspection and systematic monitoring; (3) applied field datasheets; (4) in situ evaluation; (5) scanline surveying; and (6) GIS mapping. This thesis covers fundamental matters that were developed over the course of scientific publication and, as a consequence, represent the results obtained and discussed. The subjects directly related to the thesis architecture are: (i) cartography applied to coastal dynamics (including an art-historical analysis as a tool to comprehend coastal evolution and the littoral zone); (ii) georesources assessment (the role of cartography in georesources zoning, assessment and armourstone durability); (iii) coastal geoengineering applications and monitoring (the Espinho pilot site in NW Portugal as an experimental field); (iv) rocky coast and shore platform studies and characterisation; (v) sandy and mixed environment approaches; (vi) coastal geosciences GIS mapping and photogrammetric surveying (coastal geoengineering); and (vii) shoreline change mapping and coastal management strategies (the CartGalicia Project in NW Spain as an example). Finally, all of these thematic areas were crucial to generate the conceptual models proposed and to shape the future of integrated coastal geoengineering management.
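As an illustration of the shoreline-change statistics that DSAS computes along baseline transects, the sketch below derives the End Point Rate (EPR) and the Linear Regression Rate (LRR) for a single transect; the survey years and shoreline positions are invented.

```python
# Illustrative shoreline positions (metres seaward of a fixed baseline)
# measured along one transect in different survey years; values invented.
years = [1958, 1980, 1995, 2010]
positions_m = [120.0, 112.5, 108.0, 101.0]

# End Point Rate: net shoreline movement divided by elapsed time.
epr = (positions_m[-1] - positions_m[0]) / (years[-1] - years[0])

# Linear Regression Rate: slope of the ordinary-least-squares fit,
# which uses all surveys rather than just the two end points.
n = len(years)
mean_t = sum(years) / n
mean_p = sum(positions_m) / n
lrr = sum((t - mean_t) * (p - mean_p) for t, p in zip(years, positions_m)) \
      / sum((t - mean_t) ** 2 for t in years)

print(f"EPR = {epr:.3f} m/yr, LRR = {lrr:.3f} m/yr")  # negative = retreat
```

Mapping such rates transect by transect in a GIS is what allows the erosion susceptibility zoning described above.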

Relevance: 30.00%

Publisher:

Abstract:

Master's dissertation in Water and Coastal Management (Gestão da Água e da Costa), Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2010

Relevance: 30.00%

Publisher:

Abstract:

The Montado ecosystem in the Alentejo region, in the south of Portugal, exhibits enormous agro-ecological and economic heterogeneity. Homogeneous sub-units were defined within this heterogeneous ecosystem, but only partial statistical information about the allocation of soil to agro-forestry activities is available for them. This paper proposes to recover the unknown soil allocation in each homogeneous sub-unit by disaggregating a complete data set for the Montado ecosystem area using incomplete information at the sub-unit level. The methodological framework is based on a Generalized Maximum Entropy approach, developed in three steps: the specification of an r-order Markov process, the estimation of aggregate transition probabilities, and the disaggregation of the data to recover the unknown soil allocation in each homogeneous sub-unit. The quality of the results, evaluated using the predicted absolute deviation (PAD) and the Disaggregation Information Gain (DIG), shows very acceptable estimation errors.
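A toy illustration of two ingredients named above: aggregate transition probabilities for a first-order Markov process, and the predicted absolute deviation (PAD) quality measure. Land-use categories and counts are invented, and the Generalized Maximum Entropy estimation itself is not shown.

```python
# Toy first-order Markov estimate of land-use transition probabilities
# from aggregate counts (categories and numbers invented for illustration).
counts = {
    "cork_oak": {"cork_oak": 80, "pasture": 15, "crops": 5},
    "pasture":  {"cork_oak": 10, "pasture": 70, "crops": 20},
    "crops":    {"cork_oak": 5,  "pasture": 25, "crops": 70},
}

# Normalise each row so it becomes a probability distribution.
transition = {
    src: {dst: c / sum(row.values()) for dst, c in row.items()}
    for src, row in counts.items()
}

def pad(observed, estimated):
    """Predicted absolute deviation: mean absolute gap between shares."""
    return sum(abs(o - e) for o, e in zip(observed, estimated)) / len(observed)

print(transition["cork_oak"])                      # row sums to 1
print(pad([0.5, 0.3, 0.2], [0.48, 0.33, 0.19]))    # small = good recovery
```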

Relevance: 30.00%

Publisher:

Abstract:

In the age of E-Business, many companies are faced with massive data sets that must be analysed to gain a competitive edge. These data sets are in many instances incomplete and quite often not of very high quality. Although statistical analysis can be used to pre-process these data sets, this technique has its own limitations. In this paper we present a system – and its underlying model – that can be used to test the integrity of existing data and to pre-process the data into cleaner data sets to be mined. LH5 is a rule-based system capable of self-learning; it is illustrated using a medical data set.
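The integrity-testing idea can be sketched as a small set of declarative rules applied to each record. The rule names, field names and thresholds below are invented stand-ins; the abstract does not describe LH5's actual rules or its self-learning component.

```python
# Minimal rule-based integrity check in the spirit of the system described;
# rules and medical field names are hypothetical.
RULES = [
    ("age in plausible range",   lambda r: 0 <= r.get("age", -1) <= 120),
    ("systolic above diastolic", lambda r: r.get("sys_bp", 0) > r.get("dia_bp", 0)),
    ("diagnosis code present",   lambda r: bool(r.get("icd_code"))),
]

def validate(record):
    """Return the names of the rules the record violates."""
    return [name for name, check in RULES if not check(record)]

clean = {"age": 54, "sys_bp": 130, "dia_bp": 85, "icd_code": "I10"}
dirty = {"age": 230, "sys_bp": 80, "dia_bp": 95, "icd_code": ""}

print(validate(clean))  # []
print(validate(dirty))  # all three rules fail
```

Records that pass every rule flow into the mining stage; the violation lists for failing records are exactly the pre-processing signal the paper argues plain statistical screening can miss.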

Relevance: 30.00%

Publisher:

Abstract:

This paper presents a methodology based on statistical failure and repair data of transmission power system components that uses fuzzy-probabilistic modelling of system component outage parameters. Using statistical records allows the fuzzy membership functions of the component outage parameters to be developed. The proposed hybrid method of fuzzy sets and Monte Carlo simulation based on the fuzzy-probabilistic models captures both the randomness and the fuzziness of component outage parameters. Once the system states have been obtained by Monte Carlo simulation, a network contingency analysis is performed to identify any overloading or voltage violation in the network. This is followed by a remedial action algorithm, based on optimal power flow, that reschedules generation to alleviate constraint violations and, at the same time, avoids any load curtailment if possible or otherwise minimizes the total load curtailment for the states identified by the contingency analysis. To illustrate the application of the proposed methodology to a practical case, the paper includes a case study of the IEEE 24-bus Reliability Test System (RTS-1996).
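A minimal sketch of the fuzzy/Monte Carlo combination for a single component, assuming a triangular membership function for its outage probability and treating that membership as a sampling density. This is a simplification of the alpha-cut machinery a full implementation would use, and all numbers are invented.

```python
import random

random.seed(42)

# Triangular fuzzy outage probability (low, modal, high) for one line;
# the paper builds such memberships from statistical failure/repair records.
low, mode, high = 0.02, 0.05, 0.10

def sample_state():
    """Draw an outage probability from the fuzzy (triangular) model,
    then draw the component state: True = in service."""
    p_out = random.triangular(low, high, mode)
    return random.random() >= p_out

trials = 100_000
availability = sum(sample_state() for _ in range(trials)) / trials
print(round(availability, 3))  # near 1 minus the mean outage probability
```

In the full method, one such draw per component defines a system state, which is then passed to the contingency analysis and, if needed, the optimal-power-flow remedial step.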

Relevance: 30.00%

Publisher:

Abstract:

Beyond the classical statistical approaches (determination of basic statistics, regression analysis, ANOVA, etc.), a new set of applications of different statistical techniques has increasingly gained relevance in the analysis, processing and interpretation of data concerning the characteristics of forest soils. This can be seen in some recent publications in the context of multivariate statistics. These new methods require additional care that is not always taken or referred to in some approaches. In the particular case of geostatistical applications it is necessary, besides geo-referencing all the data acquisition, to collect the samples in regular grids and in sufficient quantity so that the variograms can reflect the spatial distribution of soil properties in a representative manner. As for the great majority of multivariate statistical techniques (Principal Component Analysis, Correspondence Analysis, Cluster Analysis, etc.), although in most cases they do not require the assumption of a normal distribution, they nevertheless need a proper and rigorous strategy for their use. In this work, some reflections about these methodologies are presented, in particular about the main constraints that often occur during the information-collecting process and about the various ways of linking these different techniques. Finally, illustrations of some particular applications of these statistical methods are also presented.
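As a small illustration of the geostatistical point above (regular grids with enough samples so that the variogram is representative), the sketch below computes an empirical semivariogram on an invented 5×5 grid of soil values with a spatial trend.

```python
# Empirical semivariogram sketch for geo-referenced samples on a regular
# grid; coordinates and the soil property values are invented.
samples = [((x, y), 3.0 + 0.1 * (x + y)) for x in range(5) for y in range(5)]

def empirical_variogram(samples, lags=(1, 2, 3), tol=0.01):
    """gamma(h) = half the mean squared difference over pairs ~h apart."""
    gamma = {}
    for h in lags:
        sq_diffs = [
            (v1 - v2) ** 2
            for (p1, v1) in samples for (p2, v2) in samples
            if abs(((p1[0] - p2[0]) ** 2 + (p1[1] - p2[1]) ** 2) ** 0.5 - h) < tol
        ]
        gamma[h] = sum(sq_diffs) / (2 * len(sq_diffs))
    return gamma

gamma = empirical_variogram(samples)
print(gamma)  # semivariance grows with lag for this trending field
```

With too few samples or an irregular layout, each lag bin holds too few pairs and the resulting variogram no longer reflects the spatial structure, which is exactly the caution raised in the text.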

Relevance: 30.00%

Publisher:

Abstract:

Spatial analysis and social network analysis typically take into consideration social processes in specific contexts of geographical or network space. Research in political science increasingly strives to model heterogeneity and spatial dependence. The primary objective of the current study was to better understand and geographically model the relationship between "non-political" events, streaming data from social networks, and the political climate. Geographic information systems (GIS) are useful tools in the organization and analysis of streaming data from social networks. In this study, geographical and statistical analyses were combined in order to define the temporal and spatial nature of the data emanating from the popular social network Twitter during the 2014 FIFA World Cup. The study spans the entire globe because Twitter's geotagging function, the fundamental data source that makes this study possible, is not limited to a geographic area. By examining public reactions to an inherently non-political event, this study serves to illuminate broader questions about social behavior and spatial dependence. From a practical perspective, the analyses demonstrate how the discussion of political topics fluctuates according to football matches. Tableau and RapidMiner, in addition to a set of basic statistical methods, were applied to find patterns in social behavior in space and time in different geographic regions. Some insight was found into the relationship between an ostensibly non-political event – the World Cup – and public opinion transmitted by social media. The methodology could serve as a prototype for future studies and guide policy makers in governmental and non-governmental organizations in gauging public opinion in certain geographic locations.
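A toy version of the space/time bucketing that underlies such pattern searches (the study itself used Tableau and RapidMiner on the full geotagged stream); the records, countries and topic labels below are invented.

```python
from collections import Counter
from datetime import datetime

# Invented geotagged records standing in for the World Cup Twitter stream.
tweets = [
    {"time": datetime(2014, 6, 12, 20, 5), "country": "BR", "topic": "politics"},
    {"time": datetime(2014, 6, 12, 20, 7), "country": "BR", "topic": "match"},
    {"time": datetime(2014, 6, 12, 21, 1), "country": "DE", "topic": "politics"},
    {"time": datetime(2014, 6, 12, 21, 3), "country": "BR", "topic": "politics"},
]

# Count topic mentions per (country, hour) bucket: comparing the "politics"
# counts before, during and after matches is the kind of fluctuation the
# study describes.
buckets = Counter((t["country"], t["time"].hour, t["topic"]) for t in tweets)
print(buckets[("BR", 20, "politics")])
print(buckets[("BR", 21, "politics")])
```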

Relevance: 30.00%

Publisher:

Abstract:

PURPOSE: The purpose of this study was to develop a mathematical model (sine model, SIN) to describe fat oxidation kinetics as a function of the relative exercise intensity [% of maximal oxygen uptake (%VO2max)] during graded exercise and to determine the exercise intensity (Fatmax) that elicits maximal fat oxidation (MFO) and the intensity at which fat oxidation becomes negligible (Fatmin). This model included three independent variables (dilatation, symmetry, and translation) that incorporated the primary expected modulations of the curve due to training level or body composition. METHODS: Thirty-two healthy volunteers (17 women and 15 men) performed a graded exercise test on a cycle ergometer, with 3-min stages and 20-W increments. Substrate oxidation rates were determined using indirect calorimetry. SIN was compared with measured values (MV) and with other methods currently used [i.e., the RER method (MRER) and third-order polynomial curves (P3)]. RESULTS: There was no significant difference in fitting accuracy between SIN and P3 (P = 0.157), whereas MRER was less precise than SIN (P < 0.001). Fatmax (44 ± 10% VO2max) and MFO (0.37 ± 0.16 g·min⁻¹) determined using SIN were significantly correlated with MV, P3, and MRER (P < 0.001). The dilatation variable was correlated with Fatmax, Fatmin, and MFO (r = 0.79, r = 0.67, and r = 0.60, respectively; P < 0.001). CONCLUSIONS: The SIN model presents the same precision as other methods currently used in the determination of Fatmax and MFO but in addition allows calculation of Fatmin. Moreover, the three independent variables are directly related to the main expected modulations of the fat oxidation curve. SIN therefore seems to be an appropriate tool for analyzing fat oxidation kinetics obtained during graded exercise.
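The abstract does not give the exact functional form of SIN, so the sketch below uses one plausible sine-based parameterisation with dilatation, symmetry and translation parameters, purely to illustrate how Fatmax can be located on such a curve; treat the formula and default values as assumptions, not the published model.

```python
import math

# One plausible sine-based fat-oxidation curve with dilatation (d),
# symmetry (s) and translation (t) parameters; the published SIN form
# may differ, so this is illustrative only.
def fat_oxidation(intensity, mfo=0.37, d=1.0, s=1.0, t=0.0):
    """intensity: fraction of VO2max in [0, 1]; returns g/min (>= 0)."""
    x = min(max(intensity + t, 0.0), 1.0)
    return max(mfo * math.sin(math.pi * (x ** s) / d), 0.0)

# Locate Fatmax (intensity of maximal fat oxidation) by a coarse grid search.
grid = [i / 1000 for i in range(1001)]
fatmax = max(grid, key=fat_oxidation)
print(round(fatmax, 2))  # 0.5 for the symmetric default curve
```

Changing s skews the curve (shifting Fatmax), d stretches it, and t slides it along the intensity axis, mirroring the role the abstract assigns to the three variables.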

Relevance: 30.00%

Publisher:

Abstract:

Acute and chronic respiratory failure is one of the major and potentially life-threatening features in individuals with myotonic dystrophy type 1 (DM1). Despite several clinical demonstrations of respiratory problems in DM1 patients, the mechanisms are still not completely understood. This study was designed to investigate whether the DMSXL transgenic mouse model of DM1 exhibits respiratory disorders and, if so, to identify the pathological changes underlying these respiratory problems. Using pressure plethysmography, we assessed breathing function in control mice and in DMSXL mice generated after large expansions of the CTG repeat in successive generations of DM1 transgenic mice. Statistical analysis of the breathing function measurements revealed a significant decrease in the most relevant respiratory parameters in DMSXL mice, indicating impaired respiratory function. Histological and morphometric analysis showed pathological changes in the diaphragmatic muscle of DMSXL mice, characterized by an increase in the percentage of type I muscle fibers, the presence of central nuclei, partial denervation of end-plates (EPs) and a significant reduction in EP size, shape complexity and density of acetylcholine receptors, all of which reflect a possible breakdown in communication between the diaphragmatic muscle fibers and the nerve terminals. These diaphragm muscle abnormalities were accompanied by an accumulation of mutant DMPK RNA foci in muscle fiber nuclei. Moreover, in DMSXL mice the number of unmyelinated phrenic afferents is significantly lower. By contrast, no significant neuronopathy was detected in these mice in either cervical phrenic motor neurons or brainstem respiratory neurons. Because EPs are involved in the transmission of action potentials and the unmyelinated phrenic afferents exert a modulating influence on the respiratory drive, the pathological alterations affecting these structures might underlie the respiratory impairment detected in DMSXL mice. Understanding the mechanisms of respiratory deficiency should guide pharmaceutical and clinical research towards better therapy for the respiratory deficits associated with DM1.

Relevance: 30.00%

Publisher:

Abstract:

New density functionals representing the exchange and correlation energies (per electron), based on the electron gas model, are employed to calculate the interaction potentials of noble gas systems X2 and XY, where X (and Y) are He, Ne, Ar and Kr, and of hydrogen atom–rare gas systems H–X. The exchange energy density functional is that recommended by Handler, and the correlation energy density functional is a rational function involving two parameters which were optimized to reproduce the correlation energy of the He atom. Application of the two-parameter function to the other rare gas atoms shows that it is "universal", i.e., accurate for the systems considered. The potentials obtained in this work compare well with recent experimental results and are a significant improvement over those from competing statistical models.

Relevance: 30.00%

Publisher:

Abstract:

The purpose of this study is to examine the impact of the choice of cut-off points, sampling procedures, and the business cycle on the accuracy of bankruptcy prediction models. Misclassification can result in erroneous predictions leading to prohibitive costs to firms, investors and the economy. To test the impact of the choice of cut-off points and sampling procedures, three bankruptcy prediction models are assessed: Bayesian, Hazard and Mixed Logit. A salient feature of the study is that the analysis includes both parametric and nonparametric bankruptcy prediction models. A sample of firms from the Lynn M. LoPucki Bankruptcy Research Database in the U.S. was used to evaluate the relative performance of the three models. The choice of cut-off point and sampling procedure was found to affect the rankings of the various models. In general, the results indicate that the empirical cut-off point estimated from the training sample resulted in the lowest misclassification costs for all three models. Although the Hazard and Mixed Logit models resulted in lower misclassification costs in the randomly selected samples, the Mixed Logit model did not perform as well across varying business cycles. In general, the Hazard model has the highest predictive power. However, the higher predictive power of the Bayesian model when the ratio of the cost of Type I errors to the cost of Type II errors is high is relatively consistent across all sampling methods. Such an advantage may make the Bayesian model more attractive in the current economic environment. This study extends recent research comparing the performance of bankruptcy prediction models by identifying the conditions under which a model performs better. It also allays the concerns of a range of user groups – including auditors, shareholders, employees, suppliers, rating agencies, and creditors – with respect to assessing failure risk.
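The empirical cut-off idea can be sketched as a grid search for the threshold that minimises expected misclassification cost on a training sample, with Type I errors (a bankrupt firm classified as healthy) costed more heavily than Type II errors. The scores, labels and cost ratio below are invented.

```python
# Invented training-sample bankruptcy scores and outcomes (1 = bankrupt).
scores = [0.05, 0.10, 0.20, 0.35, 0.40, 0.55, 0.60, 0.80, 0.90, 0.95]
labels = [0,    0,    0,    0,    1,    0,    1,    1,    1,    1]

COST_TYPE_I = 10.0   # bankrupt firm classified as healthy
COST_TYPE_II = 1.0   # healthy firm classified as bankrupt

def cost(cutoff):
    """Total misclassification cost when predicting bankrupt iff score >= cutoff."""
    c = 0.0
    for s, y in zip(scores, labels):
        predicted_bankrupt = s >= cutoff
        if y == 1 and not predicted_bankrupt:
            c += COST_TYPE_I
        elif y == 0 and predicted_bankrupt:
            c += COST_TYPE_II
    return c

# Empirical cut-off: the grid value with the lowest training-sample cost.
best = min((c / 100 for c in range(101)), key=cost)
print(best, cost(best))
```

Re-running this search under different Type I/Type II cost ratios is one way to see why model rankings shift with the cut-off choice, as the study reports.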

Relevance: 30.00%

Publisher:

Abstract:

This paper studies seemingly unrelated linear models with integrated regressors and stationary errors. By adding leads and lags of the first differences of the regressors and estimating this augmented dynamic regression model by feasible generalized least squares using the long-run covariance matrix, we obtain an efficient estimator of the cointegrating vector that has a limiting mixed normal distribution. Simulation results suggest that this new estimator compares favorably with others already proposed in the literature. We apply these new estimators to the testing of purchasing power parity (PPP) among the G-7 countries. The test based on the efficient estimates rejects the PPP hypothesis for most countries.
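The augmentation step (adding leads and lags of the first differences of the regressors before estimation) can be sketched as follows; the data are invented, one lead and one lag are used, and the subsequent feasible GLS step with the long-run covariance matrix is omitted.

```python
# Invented integrated regressor x and dependent series y.
x = [1.0, 1.4, 2.1, 2.9, 3.2, 4.0, 4.7, 5.5, 6.1, 7.0]
y = [2.1, 2.9, 4.0, 6.2, 6.3, 8.1, 9.6, 10.9, 12.4, 13.8]

dx = [b - a for a, b in zip(x, x[1:])]  # first differences: dx[i] = x[i+1] - x[i]

# Build the augmented regression rows: y_t on x_t plus the lead,
# contemporaneous, and lagged first differences of x.
rows = []
for t in range(2, len(x) - 1):          # trim the ends lost to lead/lag terms
    rows.append({
        "y": y[t],
        "x": x[t],
        "dx_lead": dx[t],       # x_{t+1} - x_t
        "dx_now":  dx[t - 1],   # x_t - x_{t-1}
        "dx_lag":  dx[t - 2],   # x_{t-1} - x_{t-2}
    })

print(len(rows), rows[0])
```

Estimating this augmented regression by feasible GLS with the long-run covariance matrix is what delivers the efficient cointegrating-vector estimator the abstract describes.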

Relevance: 30.00%

Publisher:

Abstract:

This paper assesses the empirical performance of an intertemporal option pricing model with latent variables which generalizes the Hull-White stochastic volatility formula. Using this generalized formula in an ad hoc fashion to extract two implicit parameters and forecast next-day S&P 500 option prices, we obtain pricing errors similar to those produced with implied volatility alone, as in the Hull-White case. When we specialize this model to an equilibrium recursive utility model, we show through simulations that option prices are more informative than stock prices about the structural parameters of the model. We also show that a simple method of moments with a panel of option prices provides good estimates of the parameters of the model. This lays the ground for an empirical assessment of this equilibrium model with S&P 500 option prices in terms of pricing errors.
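A generic Monte Carlo sketch of call pricing when volatility is itself stochastic, in the spirit of the Hull-White setting (lognormal variance, shocks uncorrelated with the stock); all parameter values are invented, and this is not the paper's latent-variable model.

```python
import math
import random

random.seed(7)

# Invented parameters: spot, strike, rate, maturity, initial variance,
# and volatility-of-volatility.
S0, K, r, T = 100.0, 100.0, 0.03, 0.5
v0, xi = 0.04, 0.3
steps, paths = 50, 20_000
dt = T / steps

payoffs = []
for _ in range(paths):
    s, v = S0, v0
    for _ in range(steps):
        z1, z2 = random.gauss(0, 1), random.gauss(0, 1)  # uncorrelated shocks
        s *= math.exp((r - 0.5 * v) * dt + math.sqrt(v * dt) * z1)
        # Driftless lognormal variance, so E[v] stays at v0.
        v *= math.exp(-0.5 * xi * xi * dt + xi * math.sqrt(dt) * z2)
    payoffs.append(max(s - K, 0.0))

price = math.exp(-r * T) * sum(payoffs) / paths
print(round(price, 2))  # near the Black-Scholes price at vol sqrt(v0)
```

With uncorrelated volatility shocks the Monte Carlo price stays close to the Black-Scholes value at the expected volatility, which is why implied volatility alone performs comparably in the ad hoc forecasting exercise above.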