950 results for Mathematical and statistical techniques


Relevance: 100.00%

Abstract:

Aerosols in the atmosphere play a major role in the radiation balance of the Earth-atmosphere system. The direct and indirect impacts of aerosols on weather and climate remain a topic to be investigated in detail, and the effect of aerosols on the radiation budget, and thereby on circulation patterns, is important and requires further study. This thesis performs a detailed analysis of aerosol properties, their variability, and the meteorological processes that affect aerosol properties and distribution over the Indian region. The doctoral thesis, entitled “Characteristics of aerosols over the Indian region and their variability associated with atmospheric conditions”, contains 7 chapters. It presents results on the spatial and temporal distribution and on the characteristics of aerosols over the Indian region and adjoining seas. Regional and station-wise data were analysed, and methods such as modelling and statistical analysis were applied to understand aerosol properties, classification and transport. Chapter 1 presents a brief introduction to aerosols, their measurement techniques, the impact of aerosols on the atmospheric radiation budget, the climatic and geographic features of the study area, and a review of previous studies. It provides the basic background to the field of study and the objectives of the thesis. Definitions of aerosols, their sources and sinks, and the classification of particles according to optical and microphysical properties are given. Measurement techniques, such as sampling and remote sensing methods, are explained in detail. The physical parameters used to describe aerosol properties and the effect of aerosols on the radiation distribution are also discussed, together with the objectives of the thesis and a description of the climatic features of the study area.

Relevance: 100.00%

Abstract:

Surface flow types (SFTs) are advocated as ecologically relevant hydraulic units, often mapped visually from the bankside to characterise rapidly the physical habitat of rivers. SFT mapping is simple, non-invasive and cost-efficient. However, it is also qualitative, subjective and plagued by difficulties in recording accurately the spatial extent of SFT units. Quantitative validation of the underlying physical habitat parameters is often lacking and, where undertaken, does not consistently differentiate between SFTs. Here, we investigate explicitly the accuracy, reliability and statistical separability of traditionally mapped SFTs as indicators of physical habitat, using independent hydraulic and topographic data collected during three surveys of a c. 50 m reach of the River Arrow, Warwickshire, England. We also explore the potential of a novel remote sensing approach, comprising a small unmanned aerial system (sUAS) and Structure-from-Motion photogrammetry (SfM), as an alternative method of physical habitat characterisation. Our key findings indicate that SFT mapping accuracy is highly variable, with overall mapping accuracy not exceeding 74%. Analysis of similarity (ANOSIM) tests found that strong differences did not exist between all SFT pairs. This leads us to question the suitability of SFTs for characterising physical habitat for river science and management applications. In contrast, the sUAS-SfM approach provided high-resolution, spatially continuous, spatially explicit, quantitative measurements of water depth and point cloud roughness at the microscale (spatial scales ≤ 1 m). Such data are acquired rapidly and inexpensively, and provide new opportunities for examining the heterogeneity of physical habitat over a range of spatial and temporal scales. Whilst continued refinement of the sUAS-SfM approach is required, we propose that this method offers an opportunity to move away from broad, mesoscale classifications of physical habitat (spatial scales 10-100 m), and towards continuous, quantitative measurements of the continuum of hydraulic and geomorphic conditions which actually exists at the microscale.
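
As an aside on method, the ANOSIM test used above is easy to reproduce: the statistic R compares mean ranked distances between and within groups, with significance assessed by permutation. A minimal sketch in Python (the hydraulic samples and SFT labels below are invented placeholders, not the survey data):

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import rankdata

def anosim_r(d_condensed, groups):
    """ANOSIM statistic: R = (rB - rW) / (M/2), where rB/rW are mean
    ranked distances between/within groups and M = n(n-1)/2."""
    n = len(groups)
    ranks = rankdata(d_condensed)
    i, j = np.triu_indices(n, k=1)        # same pair ordering as pdist
    within = groups[i] == groups[j]
    return (ranks[~within].mean() - ranks[within].mean()) / (len(ranks) / 2)

def anosim(X, groups, permutations=999, seed=0):
    rng = np.random.default_rng(seed)
    d = pdist(X)                           # Euclidean distances by default
    groups = np.asarray(groups)
    r_obs = anosim_r(d, groups)
    perm = np.array([anosim_r(d, rng.permutation(groups))
                     for _ in range(permutations)])
    p = (np.sum(perm >= r_obs) + 1) / (permutations + 1)
    return r_obs, p

# Placeholder samples: depth/velocity pairs labelled with a mapped SFT.
X = np.random.default_rng(1).normal(size=(60, 2))
sft = np.repeat(["riffle", "glide", "pool"], 20)
print(anosim(X, sft))   # R near 0 here: random data has no group structure
```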

Relevance: 100.00%

Abstract:

The ocean and its resources are increasingly seen as indispensable in addressing the multiple challenges the planet is facing in the decades to come. However, it has never been easy to quantify this particular sector of the economy in any country, given the lack of a detailed, centralized database with adequate specifics covering the necessary sectors. This article therefore aims to compare the existing ocean-economy statistical systems, especially those of Asia-Pacific, American and European countries, in order to overcome the deficiencies arising from the diversity of definitions and statistical representations of ocean sectors, to establish a standard statistical system, and to compile data for the global ocean economy.

Relevance: 100.00%

Abstract:

In the last thirty years, the emergence and progression of biologging technology has led to great advances in marine predator ecology. Large databases of location and dive observations from biologging devices have been compiled for an increasing number of diving predator species (such as pinnipeds, sea turtles, seabirds and cetaceans), enabling complex questions about animal activity budgets and habitat use to be addressed. Central to answering these questions is our ability to correctly identify and quantify the frequency of essential behaviours, such as foraging. Despite technological advances that have increased the quality and resolution of location and dive data, accurately interpreting behaviour from such data remains a challenge, and analytical methods are only beginning to unlock the full potential of existing datasets. This review evaluates both traditional and emerging methods and presents a starting platform of options for future studies of marine predator foraging ecology, particularly from location and two-dimensional (time-depth) dive data. We outline the different devices and data types available, discuss the limitations and advantages of commonly used analytical techniques, and highlight key areas for future research. We focus our review on pinnipeds - one of the most studied taxa of marine predators - but offer insights that will be applicable to other air-breathing marine predator tracking studies. We highlight that traditionally used methods for inferring foraging from location and dive data, such as first-passage time and dive shape analysis, have important caveats and limitations depending on the nature of the data and the research question. We suggest that more holistic statistical techniques, such as state-space models, which can synthesise multiple track, dive and environmental metrics whilst simultaneously accounting for measurement error, offer more robust alternatives. Finally, we identify a need for more research to elucidate the role of physical oceanography, device effects, study animal selection, and developmental stages in predator behaviour and data interpretation.
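
As an illustration of one of the traditional methods discussed, first-passage time (FPT) measures how long an animal takes to leave a circle of given radius around each track point; peaks in the variance of log(FPT) across radii are commonly read as the scale of area-restricted search, a foraging proxy. A minimal sketch, assuming a regularly sampled track in projected (x, y) coordinates (all names illustrative, and only the forward direction is considered for brevity):

```python
import numpy as np

def first_passage_times(x, y, t, radius):
    """For each track point, time until the animal first moves more than
    `radius` away (searching forward in time). NaN if it never leaves."""
    n = len(x)
    fpt = np.full(n, np.nan)
    for i in range(n):
        d = np.hypot(x[i:] - x[i], y[i:] - y[i])
        out = np.nonzero(d > radius)[0]
        if out.size:
            fpt[i] = t[i + out[0]] - t[i]
    return fpt

# Typical use: compute FPT over a range of radii and look for the radius
# that maximises var(log(FPT)) -- the putative scale of search behaviour.
rng = np.random.default_rng(0)
x, y = rng.normal(size=1000).cumsum(), rng.normal(size=1000).cumsum()
t = np.arange(1000.0)                       # regular 1-unit sampling
fpt = first_passage_times(x, y, t, radius=5.0)
```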

Relevance: 100.00%

Abstract:

Institutions are widely regarded as important, even ultimate, drivers of economic growth and performance. The recent mainstream of institutional economics has concentrated on the effects of persistent, often imprecisely measured institutions and on cataclysmic events as agents of noteworthy institutional change. As a consequence, institutional change without large-scale shocks has received little attention. In this dissertation I apply a complementary, quantitative-descriptive approach that relies on measures of actually enforced institutions to study institutional persistence and change over a long time period undisturbed by the typically studied cataclysmic events. Placing institutional change at the center of attention makes it possible to recognize different speeds of institutional innovation and the continuous coexistence of institutional persistence and change. Specifically, I combine text mining procedures, network analysis techniques and statistical approaches to study persistence and change in England's common law over the Industrial Revolution (1700-1865). Based on the doctrine of precedent - a peculiarity of common law systems - I construct and analyze what appears to be the first citation network reflecting lawmaking in England. Most strikingly, I find large-scale change in the making of English common law around the turn of the 19th century - a period free from the typically studied cataclysmic events. Within a few decades, a legal innovation process with low depreciation rates (1 to 2 percent) and strong past-persistence transitioned to a present-focused innovation process with significantly higher depreciation rates (4 to 6 percent) and weak past-persistence. Comparison with U.S. Supreme Court data reveals a similar U.S. transition towards the end of the 19th century. The English and U.S. transitions appear to have unfolded in a very specific manner: a new body of law arose during the transitions and developed in a self-referential manner, while the existing body of law lost influence but remained prominent. Additional findings suggest that Parliament doubled its influence on the making of case law within the first decades after the Glorious Revolution, and that England's legal rules manifested a high degree of long-term persistence. The latter allows for the possibility that the often-noted persistence of institutional outcomes derives from the actual persistence of institutions.
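
For readers unfamiliar with the mechanics, the citation-network construction and a depreciation estimate can be sketched as follows; the case identifiers, years and the simple exponential-decay regression are illustrative assumptions, not the dissertation's actual pipeline:

```python
import networkx as nx
import numpy as np

# Directed citation graph: an edge u -> v means case u cites precedent v.
cases = {"caseA": 1705, "caseB": 1760, "caseC": 1801}     # case -> year (toy)
G = nx.DiGraph()
G.add_nodes_from((c, {"year": y}) for c, y in cases.items())
G.add_edges_from([("caseB", "caseA"), ("caseC", "caseB"), ("caseC", "caseA")])

# Citation age: how far back in time each citation reaches.
ages = [G.nodes[u]["year"] - G.nodes[v]["year"] for u, v in G.edges]

# One crude depreciation estimate: regress log citation counts on age,
# counts(age) ~ exp(-delta * age). Only meaningful with real data volumes.
age_vals, counts = np.unique(ages, return_counts=True)
delta = -np.polyfit(age_vals, np.log(counts), 1)[0]
print(f"estimated depreciation rate: {delta:.3f} per year")
```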

Relevance: 100.00%

Abstract:

This paper presents the development and evaluation of PICTOAPRENDE, an interactive software application designed to improve oral communication and to contribute to the development of children and young people diagnosed with autism spectrum disorder (ASD) in Ecuador. To this end, the intervention area is first analyzed, describing the general characteristics of people with ASD and their status in Ecuador. The statistical techniques used for this evaluation constitute the basis of the study. A section presenting the development of research-based cognitive and social parameters for the intervention area is also included. Finally, the algorithms used to obtain the measurements and the experimental results, along with their analysis, are presented.

Relevance: 100.00%

Abstract:

This paper aims to categorize Brazilian Internet users according to the diversity of their online activities and to assess the propensity of these Internet user groups to use electronic government (e-gov) services. Amartya Sen's Capability Approach was adopted as the theoretical framework for its consideration of people's freedom to decide on their use of available resources and of their competencies for these decisions, leading to the use of e-gov services. Multivariate statistical techniques were used to analyze data from the 2007, 2009 and 2011 editions of the ICT Household Survey. The results showed that Internet users belonging to the advanced and intermediate use groups were more likely to use e-gov services than those belonging to the sporadic use group. Moreover, the results also demonstrated that the intermediate use group showed a higher tendency to use e-gov services than the advanced use group, a tendency possibly related to the latter's extensive use of interactive and collaborative leisure and entertainment activities. The findings of this research may be useful in guiding public policies for the dissemination and provision of electronic government services in Brazil.
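
A rough sketch of this kind of user categorization, using k-means on activity counts; the features and data below are invented placeholders, and the paper's actual multivariate techniques may differ:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Rows: respondents; columns: counts of online activities per category
# (communication, leisure, information, transactions) -- placeholder data.
rng = np.random.default_rng(42)
X = rng.poisson(lam=[2, 1, 3, 1], size=(500, 4)).astype(float)

Xs = StandardScaler().fit_transform(X)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(Xs)

# Order clusters by overall activity to name them sporadic/intermediate/advanced.
totals = km.cluster_centers_.sum(axis=1)        # overall activity per cluster
rank = np.argsort(np.argsort(totals))           # 0 = least active cluster
names = np.array(["sporadic", "intermediate", "advanced"])
user_group = names[rank][km.labels_]            # group label per respondent
```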

Relevance: 100.00%

Abstract:

For this project I prepared a series of recitals featuring music for horn and percussion in which the horn part uses extended techniques. For this project, I considered anything beyond the open or muted horn an extended technique. These techniques range from common hand-stopped passages to complex new techniques involving half-valves, multiphonics, and more, for new sounds desired by the composer. There are several pieces written for solo horn and percussion, with ensembles ranging from simple duets to solo horn with a full percussion ensemble; however, few include extended techniques for the horn, and those that do are lesser known because of their difficulty, primarily the challenge of the extended techniques the composers request. In the introduction to this paper I give a brief background to the project, where the current repertoire stands, and my experiences with commissioning works for this genre. I then give a brief history of, and how-to on, the more common extended techniques, which were found in almost every piece; I describe these techniques separately so that they can be referenced in the performance notes without repetitive description. The main performance notes on the chosen repertoire follow, comprising a brief description of each piece and a longer discussion for performers and composers who wish to learn more about these techniques. In this section my primary focus is the extended techniques used, and I provide score samples, with permission, to further the education of the next musicians to tackle this genre. All works performed for this project were recorded and accompany this paper in the Digital Repository at the University of Maryland (DRUM). The following works were included in this project:

- Howard J. Buss, Dreams from the Shadows (2015)
- Howard J. Buss, Night Tide (1995)
- George Crumb, An Idyll for the Misbegotten, trans. Robert Patterson (1986/1997)
- Charles Fernandez, Metamorphosis: A Horn’s Life, “Prenatal and Toddler” (2016, unfinished)
- Helen Gifford, Of Old Angkor (1995)
- Douglas Hill, Thoughtful Wanderings… (1990)
- Pierre-Yves Level, Duetto pour Cor en Fa et Percussion (1999)
- David Macbride, Elegy for Horn and Timpani (2009)
- Brian Prechtl, A Song of David (1995)
- Verne Reynolds, HornVibes (1986)
- Pablo Salazar, Cincontar (2016)
- Mark Schultz, Dragons in the Sky (1989)
- Faye-Ellen Silverman, Protected Sleep (2007)
- Charles Taylor, Sonata for Horn and Marimba (1991)
- Robert Wolk, Tessellations (2016)

With this project, I intend to promote these pieces and the techniques they use, to encourage more works written in this style, and to show fellow horn players that the techniques should not prevent these great works from being performed. Given the limited existing repertoire, I also successfully commissioned new pieces featuring extended techniques, which were premiered on the final recital.

Relevance: 100.00%

Abstract:

Due to the increasing integration density and operating frequency of today's high-performance processors, the temperature of a typical chip can easily exceed 100 degrees Celsius. However, the runtime thermal state of a chip is very hard to predict and manage due to the random nature of computing workloads, as well as process, voltage and ambient temperature variability (together called PVT variability). The uneven nature (in both time and space) of the chip's heat dissipation can lead to severe reliability issues and error-prone behavior (e.g. timing errors). Many dynamic power/thermal management techniques have been proposed to address this issue, such as dynamic voltage and frequency scaling (DVFS) and clock gating. However, most such techniques require accurate knowledge of the runtime thermal state of the chip to make efficient and effective control decisions. In this work we address the problem of tracking and managing the temperature of microprocessors, which includes the following sub-problems: (1) how to design an efficient sensor-based thermal tracking system on a given design that can provide accurate real-time temperature feedback; (2) what statistical techniques can be used to estimate the full-chip thermal profile from very limited (and possibly noise-corrupted) sensor observations; and (3) how to adapt to changes in the underlying system's behavior, since such changes can impact the accuracy of the thermal estimation. The thermal tracking methodology proposed in this work is enabled by on-chip sensors, which are already implemented in many modern processors. We first investigate the underlying relationship between heat distribution and power consumption, and then introduce an accurate thermal model of the chip system. Based on this model, we characterize the temperature correlation that exists among different chip modules and explore statistical approaches (such as those based on the Kalman filter) that utilize this correlation to estimate accurate chip-level thermal profiles in real time. The estimation is performed from limited sensor information because sensors are usually resource-constrained and noise-corrupted. We further extend the standard Kalman filter approach to account for (1) nonlinear effects such as the leakage-temperature interdependency and (2) varying statistical characteristics of the underlying system model. The proposed thermal tracking infrastructure and estimation algorithms consistently generate accurate thermal estimates even when the system switches among workloads with very distinct characteristics. In experiments, our approaches demonstrated much higher accuracy than existing approaches. Such results can be used to ensure thermal reliability and improve the effectiveness of dynamic thermal management techniques.
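
A minimal sketch of the Kalman-filter estimation step described above, assuming a linear thermal state model x' = Ax + Bu + w observed through a sparse sensor map z = Cx + v; all matrices below are toy stand-ins for a calibrated thermal RC model:

```python
import numpy as np

def kalman_step(x, P, z, u, A, B, C, Q, R):
    """One predict/update cycle: x = per-module temperatures, z = readings
    from the few on-chip sensors, u = per-module power estimates."""
    # Predict from the thermal dynamics model
    x_pred = A @ x + B @ u
    P_pred = A @ P @ A.T + Q
    # Update with the noisy, low-dimensional sensor observations
    S = C @ P_pred @ C.T + R
    K = P_pred @ C.T @ np.linalg.solve(S, np.eye(len(z)))   # Kalman gain
    x_new = x_pred + K @ (z - C @ x_pred)
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new

# Toy sizes: 8 thermal nodes observed by 2 physical sensors.
n, m = 8, 2
A = 0.95 * np.eye(n); B = 0.05 * np.eye(n)
C = np.zeros((m, n)); C[0, 1] = C[1, 6] = 1.0   # sensors sit at nodes 1 and 6
Q = 0.01 * np.eye(n); R = 0.25 * np.eye(m)
x, P = np.full(n, 45.0), np.eye(n)               # initial estimate (deg C)
x, P = kalman_step(x, P, z=np.array([47.0, 50.2]),
                   u=np.full(n, 1.0), A=A, B=B, C=C, Q=Q, R=R)
```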

Relevance: 100.00%

Abstract:

Background: The aim of this study was the evaluation of a fast Gradient Spin Echo technique (GraSE) for cardiac T2-mapping, combining a robust estimation of T2 relaxation times with short acquisition times. The sequence was compared against two previously introduced T2-mapping techniques in a phantom and in vivo. Methods: Phantom experiments were performed at 1.5 T using a commercially available cylindrical gel phantom. Three different T2-mapping techniques were compared: a Multi Echo Spin Echo (MESE; serving as a reference), a T2-prepared balanced Steady State Free Precession (T2prep) and a Gradient Spin Echo sequence. For the subsequent in vivo study, 12 healthy volunteers were examined on a clinical 1.5 T scanner. The three T2-mapping sequences were performed at three short-axis slices. Global myocardial T2 relaxation times were calculated and statistical analysis was performed. For assessment of pixel-by-pixel homogeneity, the number of segments showing an inhomogeneous T2 value distribution, defined by a pixel SD exceeding 20 % of the corresponding observed T2 time, was counted. Results: Phantom experiments showed a greater difference of measured T2 values between T2prep and MESE than between GraSE and MESE, especially for species with low T1 values. Both GraSE and T2prep resulted in an overestimation of T2 times compared to MESE. In vivo, significant differences between mean T2 times were observed. In general, T2prep resulted in the lowest (52.4 +/- 2.8 ms) and GraSE in the highest T2 estimates (59.3 +/- 4.0 ms). Analysis of pixel-by-pixel homogeneity revealed the lowest number of segments with inhomogeneous T2 distribution for GraSE-derived T2 maps. Conclusions: The GraSE sequence is a fast and robust sequence, combining advantages of both MESE and T2prep techniques, which promises to improve the clinical applicability of T2-mapping in the future. Our study revealed significant differences in derived mean T2 values when applying different sequence designs. Therefore, a systematic comparison of different cardiac T2-mapping sequences and the establishment of dedicated reference values should be the goal of future studies.
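
For context, the per-pixel T2 estimate underlying all three sequences is a mono-exponential fit S(TE) = S0 · exp(−TE/T2) over the measured echo times. A minimal sketch with synthetic data, including a version of the 20 % SD homogeneity criterion used above (echo times, noise level and segment values are illustrative):

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(te, s0, t2):
    return s0 * np.exp(-te / t2)

te = np.array([10., 20., 30., 40., 50., 60.])   # echo times (ms)
sig = 1000 * np.exp(-te / 55.0) \
      + np.random.default_rng(0).normal(0, 5, te.size)   # synthetic signal

(s0, t2), _ = curve_fit(mono_exp, te, sig, p0=(sig[0], 50.0))
print(f"fitted T2 = {t2:.1f} ms")

# Homogeneity criterion: flag a segment when the SD of its pixel T2 values
# exceeds 20 % of the segment's observed T2 time.
t2_pixels = np.random.default_rng(1).normal(55, 6, 200)   # toy segment
inhomogeneous = t2_pixels.std() > 0.20 * t2_pixels.mean()
```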

Relevance: 100.00%

Abstract:

Evolutionary algorithms alone cannot solve optimization problems very efficiently, since these algorithms involve many random (not very rational) decisions. Combining evolutionary algorithms with other techniques has proven to be an efficient optimization methodology. In this talk, I will explain the basic ideas of our three algorithms along this line: (1) the orthogonal genetic algorithm, which treats crossover/mutation as an experimental design problem; (2) the multiobjective evolutionary algorithm based on decomposition (MOEA/D), which uses decomposition techniques from traditional mathematical programming in a multiobjective optimization evolutionary algorithm; and (3) the regularity model-based multiobjective estimation of distribution algorithm (RM-MEDA), which uses the regularity property and machine learning methods to improve multiobjective evolutionary algorithms.
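
The decomposition idea behind MOEA/D can be shown compactly: each weight vector λ turns the multiobjective problem into a scalar Tchebycheff subproblem g(x | λ, z*) = max_i λ_i |f_i(x) − z*_i|, and neighbouring subproblems share offspring. A minimal sketch (objective values and weights are placeholders, not the full algorithm):

```python
import numpy as np

def tchebycheff(f_x, lam, z_star):
    """Scalarize objective vector f(x) for weight vector lam, given the
    ideal point z*: g(x | lam, z*) = max_i lam_i * |f_i(x) - z*_i|."""
    return np.max(lam * np.abs(f_x - z_star))

# Evenly spread weight vectors for a 2-objective problem: one subproblem each.
n_sub = 11
weights = np.stack([np.linspace(0, 1, n_sub),
                    np.linspace(1, 0, n_sub)], axis=1)

# In MOEA/D each subproblem keeps a current solution and is updated with
# offspring generated from its *neighbouring* subproblems (closest weights).
f_x = np.array([0.4, 0.7])        # objectives of one candidate (placeholder)
z_star = np.array([0.0, 0.0])     # ideal point, updated as the search proceeds
scores = [tchebycheff(f_x, lam, z_star) for lam in weights]
```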

Relevance: 100.00%

Abstract:

In this paper we construct a model for the simultaneous compaction, by which clusters are restructured, and growth of clusters by pairwise coagulation. The model takes the form of a multicomponent aggregation problem in which the components are cluster mass and cluster diameter. Following suitable approximations, exact explicit solutions are derived which may be useful for the verification of simulations of such systems. Numerical simulations are presented to illustrate typical behaviour and to show the accuracy of the approximations made in deriving the model. The solutions are then simplified using asymptotic techniques to show the relevant timescales of the kinetic processes and to elucidate the shape of the cluster distribution functions at large times.
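
For orientation, the pairwise-coagulation part of such a model reduces, in the single-component case with a constant kernel, to the discrete Smoluchowski equations. A minimal explicit-Euler sketch (this simplification drops the diameter coordinate and the compaction process, and all values are illustrative):

```python
import numpy as np

def smoluchowski_step(n, K, dt):
    """One explicit Euler step of
    dn_k/dt = 1/2 * sum_{i+j=k} K n_i n_j - n_k * sum_j K n_j
    for a constant kernel K; n[k] is the density of mass-(k+1) clusters."""
    kmax = len(n)
    gain = np.zeros(kmax)
    for k in range(1, kmax):
        for i in range(k):                 # pairs (i+1) + (k-i) = k+1
            gain[k] += 0.5 * K * n[i] * n[k - 1 - i]
    loss = K * n * n.sum()
    return n + dt * (gain - loss)

n = np.zeros(50); n[0] = 1.0               # monodisperse initial condition
for _ in range(2000):
    n = smoluchowski_step(n, K=1.0, dt=0.01)
# Analytic check: for a constant kernel the total number density decays as
# N(t) = N0 / (1 + K * N0 * t / 2).
```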

Relevance: 100.00%

Abstract:

Statistical approaches to study extreme events require, by definition, long time series of data. In many scientific disciplines, these series are often subject to variations at different temporal scales that affect the frequency and intensity of their extremes. Therefore, the assumption of stationarity is violated and alternative methods to conventional stationary extreme value analysis (EVA) must be adopted. Using the example of environmental variables subject to climate change, in this study we introduce the transformed-stationary (TS) methodology for non-stationary EVA. This approach consists of (i) transforming a non-stationary time series into a stationary one, to which the stationary EVA theory can be applied, and (ii) reverse transforming the result into a non-stationary extreme value distribution. As a transformation, we propose and discuss a simple time-varying normalization of the signal and show that it enables a comprehensive formulation of non-stationary generalized extreme value (GEV) and generalized Pareto distribution (GPD) models with a constant shape parameter. A validation of the methodology is carried out on time series of significant wave height, residual water level, and river discharge, which show varying degrees of long-term and seasonal variability. The results from the proposed approach are comparable with the results from (a) a stationary EVA on quasi-stationary slices of non-stationary series and (b) the established method for non-stationary EVA. However, the proposed technique comes with advantages in both cases. For example, in contrast to (a), the proposed technique uses the whole time horizon of the series for the estimation of the extremes, allowing for a more accurate estimation of large return levels. Furthermore, with respect to (b), it decouples the detection of non-stationary patterns from the fitting of the extreme value distribution. As a result, the steps of the analysis are simplified and intermediate diagnostics are possible. In particular, the transformation can be carried out by means of simple statistical techniques such as low-pass filters based on the running mean and the standard deviation, and the fitting procedure is a stationary one with a few degrees of freedom and is easy to implement and control. An open-source MATLAB toolbox has been developed to cover this methodology, which is available at https://github.com/menta78/tsEva/ (Mentaschi et al., 2016).
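
A minimal sketch of the TS idea in Python (the toolbox itself is MATLAB): normalize with running-window statistics, fit a stationary GEV to block maxima of the transformed series, and map return levels back through the time-varying mean and standard deviation. Window length and data are illustrative:

```python
import numpy as np
from scipy.stats import genextreme
from scipy.ndimage import uniform_filter1d

def transform_stationary(y, window):
    """x(t) = (y(t) - mu(t)) / sigma(t), with mu and sigma from a
    running-mean / running-SD low-pass filter of width `window`."""
    mu = uniform_filter1d(y, window, mode="nearest")
    sd = np.sqrt(uniform_filter1d((y - mu) ** 2, window, mode="nearest"))
    return (y - mu) / sd, mu, sd

rng = np.random.default_rng(0)
t = np.arange(20 * 365)                             # 20 years of daily values
y = 2 + 0.0002 * t + rng.gumbel(0, 0.5, t.size)     # toy series with a trend

x, mu, sd = transform_stationary(y, window=365)
annual_max = x.reshape(20, 365).max(axis=1)         # block maxima of x
c, loc, scale = genextreme.fit(annual_max)          # stationary GEV on x

# Back-transform a stationary return level to the non-stationary scale:
x_rl = genextreme.ppf(1 - 1 / 100, c, loc, scale)   # 100-year level in x
y_rl = mu + sd * x_rl                               # time-varying level in y
```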

Relevance: 100.00%

Abstract:

School failure in Portugal is worrying, and it is a topic that has been attracting growing interest from politicians, teachers, parents and the general public. Mathematics is among the subjects that contribute most to this failure. Failure in Mathematics is an inescapable reality, visible not only in the poor results achieved by students in tests and exams, but also in the enormous difficulties they show in problem solving and mathematical reasoning and, above all, in their lack of interest in Mathematics. This dissertation sought to apply multivariate statistical methodologies, namely logistic regression and cluster analysis, to school failure. The statistical techniques were applied to a database built for this purpose from the results of a questionnaire administered to the students of one school, with the aim of investigating associations between the (lack of) success in Mathematics of students in the 3rd cycle of basic education and a set of variables concerning the students' personal, family and school circumstances. The results obtained through logistic regression analysis suggest that repeating Mathematics (a grade below level 3 at the end of the school year) depends on the student's age, the difficulties experienced, participation in class, effort, and the student's behaviour. Cluster analysis was used to group the variables into homogeneous groups with respect to common characteristics. The variables describing the student's behaviour, doing homework, and the relationship with the teacher proved strongly correlated, as did the variables difficulties and repetition, although the latter two were quite distant from the rest. Applying the two-step method to the opinion variables, on factors behind school failure and measures to remedy it, yielded three clusters: two large groups, one dominated by the response "neither agree nor disagree" and the other by "totally agree". This dissertation identified important factors that can help reduce failure, and its results may therefore be used in the future to design measures to improve success.
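
A minimal sketch of the logistic-regression step, with placeholder variables mirroring the predictors reported above (age, difficulties, participation, effort, behaviour); the data are invented and the model here is scikit-learn's, not the dissertation's exact specification:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Placeholder survey data; outcome 1 = repeating Mathematics (grade below 3).
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "age": rng.integers(12, 17, 300),
    "difficulties": rng.integers(0, 2, 300),     # self-reported difficulties
    "participation": rng.integers(1, 6, 300),    # Likert-type scales
    "effort": rng.integers(1, 6, 300),
    "behaviour": rng.integers(1, 6, 300),
})
y = rng.integers(0, 2, 300)                      # toy outcome

model = LogisticRegression(max_iter=1000).fit(df, y)
odds_ratios = pd.Series(np.exp(model.coef_[0]), index=df.columns)
print(odds_ratios)   # OR > 1: the predictor raises the odds of repeating
```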

Relevance: 100.00%

Abstract:

Purpose: To compare the oral bioavailability and pharmacokinetic parameters of different lornoxicam formulations and to assess the similarity of their plasma level profiles by statistical techniques. Methods: An open-label, two-period crossover trial was conducted in 24 healthy Pakistani volunteers (22 males, 2 females). Each participant received a single dose of lornoxicam controlled-release (CR) microparticles and two doses (morning and evening) of a conventional lornoxicam immediate-release (IR) tablet formulation. The microparticles were prepared by spray drying. The formulations were administered again, in the alternate order, after a washout period of one week. Pharmacokinetic parameters were determined with Kinetica 4.0 software from plasma concentration-time data. Moreover, the data were statistically analyzed at the 90 % confidence interval (CI) level and with Schuirmann's two one-sided t-test procedure. Results: Peak plasma concentration (Cmax) was 20.2 % lower for the CR formulation than for the IR formulation (270.90 ng/ml vs 339.44 ng/ml, respectively), while the time to Cmax (tmax) was 5.25 and 2.08 h, respectively. The area under the plasma concentration versus time curve (AUC) was comparable for the CR and IR formulations. The 90 % CI values computed for Cmax, AUC0-24 and AUC0-∞ after log transformation were 87.21, 108.51 and 102.74 %, respectively, all within the predefined bioequivalence range (80 - 125 %). Conclusion: The findings suggest that the CR formulation did not change the overall pharmacokinetic properties of lornoxicam in terms of the rate and extent of absorption.
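
The Schuirmann procedure used here can be sketched compactly: on the log scale, bioequivalence is declared when the 90 % CI of the geometric mean ratio falls within 80-125 %, equivalently when both one-sided t-tests reject. A minimal paired-design sketch (period and sequence effects of the crossover are ignored, and the values are toy data):

```python
import numpy as np
from scipy import stats

def tost_bioequivalence(test, ref, lo=0.80, hi=1.25, alpha=0.05):
    """Schuirmann's two one-sided tests on log-scale paired PK parameters
    (e.g. Cmax). Returns the 90% CI of the geometric mean ratio and a
    bioequivalence verdict."""
    d = np.log(test) - np.log(ref)
    n = len(d)
    se = d.std(ddof=1) / np.sqrt(n)
    t_crit = stats.t.ppf(1 - alpha, n - 1)
    ci = (np.exp(d.mean() - t_crit * se), np.exp(d.mean() + t_crit * se))
    # One-sided tests against the log-transformed equivalence limits
    p_lo = stats.t.sf((d.mean() - np.log(lo)) / se, n - 1)   # H1: ratio > lo
    p_hi = stats.t.cdf((d.mean() - np.log(hi)) / se, n - 1)  # H1: ratio < hi
    return ci, max(p_lo, p_hi) < alpha

cmax_cr = np.array([265., 280., 271., 268.])   # toy Cmax values (ng/ml)
cmax_ir = np.array([335., 348., 331., 344.])
print(tost_bioequivalence(cmax_cr, cmax_ir))
```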