958 results for frequency analysis problem


Relevance: 90.00%

Abstract:

If one clear argument emerged from my doctoral thesis in political science, it is that there is no agreement as to what democracy is. There are over 40 different varieties of democracy, ranging from mainstream forms with subtle or minute differences to those playing by themselves in the corner. Many of these varieties are well argued, empirically supported, and highly relevant to certain polities. The irony is that the thing they all share, the 'basic democracy' from which all other forms of democracy stem, is elusive. There is no agreement in the literature or in political practice as to what 'basic democracy' is, and that is problematic, as many of us use the word 'democracy' every day and it is a concept of tremendous importance internationally. I am still uncertain why this problem has not been resolved before by far greater minds than my own; it may have something to do with the recent growth in democratic theory this past decade and the innovative areas of thought my thesis required, but I think I have the answer. By listing each type of democracy and pairing each entry with the literature associated with that style, I amassed a large and comprehensive body of textual data. My research aimed to find out what these various styles of democracy have in common and to create a taxonomy of democracy (like the 'tree of life' in biology) showing how the various styles have 'evolved' over the past 5,000 years. I then ran a word frequency analysis: software that counts the 100 most commonly used words in the texts. This is where my reasoning came in, as I had to make sense of these words. How did they reveal the most fundamental commonalities between 40 different styles of democracy?
I used a grounded theory analysis, which required that I argue my way through these words to form a 'theory', a plausible explanation of why these particular words, and not others, are the important ones for answering the question. It came down to the argument that all 40 styles of democracy analysed have the following in common: 1) a concept of a citizenry; 2) a concept of sovereignty; 3) a concept of equality; 4) a concept of law; 5) a concept of communication; and 6) a concept of selecting officials. Thus, democracy is a defined citizenry with its own concept of sovereignty, which it exercises through the institutions that support the citizenry's understandings of equality, law, communication, and the selection of officials. Once any of these six concepts is defined in a particular way, it creates a style of democracy. From this we can also see that more than one style of democracy can be active in a particular government, as a citizenry is composed of many different aggregates, each with its own understanding of the six concepts.
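The word-frequency step described above can be sketched in a few lines. This is only a minimal stand-in for the software the thesis used; the function name, tokenization rule, and stopword handling are assumptions:

```python
from collections import Counter
import re

def top_words(texts, n=100, stopwords=frozenset()):
    """Return the n most frequent words across a corpus of texts."""
    counts = Counter()
    for text in texts:
        # crude tokenizer: lowercase, keep alphabetic runs (and apostrophes)
        for word in re.findall(r"[a-z']+", text.lower()):
            if word not in stopwords:
                counts[word] += 1
    return counts.most_common(n)
```

In practice one would feed in the literature for each style of democracy and pass a stopword list so that function words do not dominate the top 100.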

Relevance: 90.00%

Abstract:

Acoustic emission (AE) is the phenomenon in which high-frequency stress waves are generated by the rapid release of energy within a material from sources such as crack initiation or growth. The AE technique involves recording these stress waves by means of sensors placed on the surface and subsequently analysing the recorded signals to gather information such as the nature and location of the source. It is one of several diagnostic techniques currently used for structural health monitoring (SHM) of civil infrastructure such as bridges. Its advantages include the ability to provide continuous in-situ monitoring and high sensitivity to crack activity, but several challenges remain. Due to the high sampling rate required for data capture, a large amount of data is generated during AE testing. This is further complicated by the presence of spurious sources that produce AE signals which can mask the desired signals. Hence, an effective data analysis strategy is needed to achieve source discrimination; this also becomes important in long-term monitoring applications in order to avoid massive data overload. Analysis of the frequency content of recorded AE signals, together with the use of pattern recognition algorithms, is among the advanced and promising data analysis approaches for source discrimination. This paper explores the use of various signal processing tools for the analysis of experimental data, with the overall aim of finding an improved method for source identification and discrimination, with particular focus on the monitoring of steel bridges.
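As a rough illustration of frequency-content analysis, a windowed FFT can locate the dominant frequency of a recorded burst. This is a minimal sketch, not the paper's actual processing chain; the function name and the Hanning-window choice are assumptions:

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Return the peak frequency (Hz) of a real-valued signal sampled at fs."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[0] = 0.0  # ignore the DC component
    return freqs[np.argmax(spectrum)]
```

Features such as this peak frequency (or energy per frequency band) are typical inputs to the pattern recognition stage used for source discrimination.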

Relevance: 90.00%

Abstract:

Monitoring fetal well-being is a compelling problem in modern obstetrics. Clinicians have become increasingly aware of the link between fetal activity (movement), well-being, and later developmental outcome. We have recently developed an ambulatory accelerometer-based fetal activity monitor (AFAM) to record 24-hour fetal movement. Using this system, we aim to develop signal processing methods to automatically detect and quantitatively characterize fetal movements. The first step in this direction is to test the performance of the accelerometer in detecting fetal movement against real-time ultrasound imaging (taken as the gold standard). This paper reports the first results of this performance analysis.
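One simple way to score a detector against a gold standard is event-level sensitivity: the fraction of ultrasound-confirmed movement times matched by an accelerometer detection within a time tolerance. This sketch is illustrative only; the matching rule and tolerance are assumptions, not the authors' protocol:

```python
def detection_sensitivity(detected, reference, tolerance=0.5):
    """Fraction of reference event times (seconds) that have at least one
    detection within +/- tolerance seconds."""
    if not reference:
        return 0.0
    matched = sum(
        any(abs(d - r) <= tolerance for d in detected) for r in reference
    )
    return matched / len(reference)
```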

Relevance: 90.00%

Abstract:

With increasing competitiveness in global markets, many developing nations are striving to constantly improve their services in search of the next competitive edge. As a result, the demand and need for Business Process Management (BPM) in these regions is rising rapidly, yet there is a lack of professional expertise and knowledge to cater to that need. The development of well-structured BPM training/education programs has therefore become an urgent requirement for these industries. Furthermore, the lack of textbooks or other self-education material that goes beyond the basics of BPM reinforces the need for case-based teaching and for related cases that prepare the next generation of professionals in these countries. Teaching cases create an authentic learning environment in which the complexities and challenges of the 'real world' can be presented in a narrative, enabling students to develop crucial skills such as problem analysis, problem solving, and creativity within constraints, as well as the application of appropriate tools (BPMN) and techniques (including best practices and benchmarking) within richer, more realistic scenarios. The aim of this paper is to provide a comprehensive teaching case demonstrating how to tackle a developing nation's legacy government process undermined by inefficiency and ineffectiveness. The paper also includes thorough teaching notes. The article is presented in three main parts: (i) an Introduction that provides a brief background setting the context of the paper, (ii) the Teaching Case, and (iii) the Teaching Notes.

Relevance: 90.00%

Abstract:

An aeroelastic analysis based on finite elements in space and time is used to model the helicopter rotor in forward flight. The rotor blade is represented as an elastic cantilever beam undergoing flap and lag bending, elastic torsion, and axial deformations. The objective of the improved design is to reduce vibratory loads at the rotor hub, which are the main source of helicopter vibration. Constraints are imposed on aeroelastic stability, and move limits are imposed on the blade elastic stiffness design variables. Using the aeroelastic analysis, response surface approximations are constructed for the objective function (vibratory hub loads). It is found that second-order polynomial response surfaces constructed using the central composite design from the theory of design of experiments adequately represent the aeroelastic model in the vicinity of the baseline design. Optimization results show a reduction in the objective function of about 30 per cent. A key accomplishment of this paper is the decoupling of the analysis and optimization problems using response surface methods, which should encourage the use of optimization methods by the helicopter industry. (C) 2002 Elsevier Science Ltd. All rights reserved.
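A second-order polynomial response surface of this kind can be fitted by ordinary least squares. The sketch below, in Python rather than the authors' tooling, fits a full quadratic in two design variables (the paper's problem has more) and evaluates it at a trial point; all names are illustrative:

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Least-squares fit of y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2
    + b11*x1^2 + b22*x2^2 for two design variables."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef  # [b0, b1, b2, b12, b11, b22]

def predict(coef, x1, x2):
    """Evaluate the fitted surface at a design point (x1, x2)."""
    return coef @ np.array([1.0, x1, x2, x1 * x2, x1**2, x2**2])
```

Once fitted on a central-composite set of aeroelastic analyses, the cheap polynomial stands in for the expensive model inside the optimizer, which is exactly the decoupling the abstract highlights.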

Relevance: 90.00%

Abstract:

Multiple Clock Domain (MCD) processors provide an attractive solution to the increasingly challenging problems of clock distribution and power dissipation. They allow a chip to be partitioned into different clock domains, and each domain's frequency (and voltage) to be configured independently. This flexibility adds new dimensions to the Dynamic Voltage and Frequency Scaling (DVFS) problem while providing better scope for saving energy and meeting performance demands. In this paper, we propose a compiler-directed approach to MCD-DVFS. We build a formal Petri-net-based program performance model, parameterized by the settings of microarchitectural components and resource configurations, and integrate it with our compiler passes for frequency selection. Our model estimates the performance impact of a frequency setting, unlike the best existing techniques, which rely on weaker indicators of domain performance such as queue occupancies (used by online methods) or slack manifestation for a particular frequency setting (software-based methods). We evaluate our method with subsets of the SPECFP2000, Mediabench, and Mibench benchmarks. Our mean energy savings are 60.39% (versus 33.91% for the best software technique) in a memory-constrained system for cache-miss-dominated benchmarks, and we meet the performance demands. Our ED2 improves by 22.11% (versus 18.34%) for the other benchmarks. For a CPU with restricted frequency settings, our energy consumption is within 4.69% of the optimal.
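The frequency-selection idea can be caricatured with a toy analytic model: predicted runtime is CPU cycles divided by frequency plus a frequency-independent memory stall time, and the compiler picks the lowest setting that still meets the deadline. This deliberately simple stand-in replaces the paper's Petri-net performance model; all names and the runtime formula are assumptions:

```python
def select_frequency(frequencies, cycles, mem_time, deadline):
    """Pick the lowest clock-domain frequency (Hz) whose predicted runtime,
    cycles / f + mem_time (seconds), still meets the deadline.  Dynamic power
    grows superlinearly with frequency, so the lowest feasible setting
    saves the most energy."""
    for f in sorted(frequencies):
        if cycles / f + mem_time <= deadline:
            return f
    return max(frequencies)  # nothing meets the deadline: run flat out
```

For cache-miss-dominated code, `mem_time` dominates and a much lower frequency remains feasible, which is why such workloads show the largest energy savings.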

Relevance: 90.00%

Abstract:

Feature selection is an important first step in regional hydrologic studies (RHYS). Over the past few decades, advances in data collection facilities have resulted in the development of archives on a variety of hydro-meteorological variables that may be used as features in RHYS. Currently there are no established procedures for selecting features from such archives, so hydrologists often use subjective methods to arrive at a set of features, which may lead to misleading results. To alleviate this problem, a probabilistic clustering method for regionalization is presented to determine appropriate features from the available dataset. The effectiveness of the method is demonstrated by application to the regionalization of watersheds in the conterminous United States for low-flow frequency analysis. The plausible homogeneous regions formed by the proposed clustering method are compared with those from conventional methods of regionalization using L-moment-based homogeneity tests. Results show that the proposed methodology is promising for RHYS.
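Probabilistic (soft) cluster memberships of the kind used in such regionalization can be illustrated with a Gaussian-affinity sketch: each watershed gets a membership vector over region centers that sums to one. This is a generic illustration, not the paper's specific clustering method; the names and the bandwidth parameter are assumptions:

```python
import numpy as np

def soft_memberships(X, centers, bandwidth=1.0):
    """Probabilistic cluster memberships: Gaussian affinity of each
    watershed feature vector to each region center, normalized per
    watershed so every row sums to 1."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * bandwidth**2))
    return w / w.sum(axis=1, keepdims=True)
```

A watershed with near-equal memberships lies on a region boundary, which is exactly the case a hard (crisp) clustering forces into a single, possibly misleading, region.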

Relevance: 90.00%

Abstract:

Significant changes in extreme rainfall characteristics over India are reported in recent studies, although there is disagreement on the spatial uniformity and causes of the trends. Based on recent theoretical advances in Extreme Value Theory (EVT), we analyze changes in extreme rainfall characteristics over India using a high-resolution daily gridded (1° latitude × 1° longitude) dataset. The intensity, duration, and frequency of excess rain over a high threshold in the summer monsoon season are modeled by non-stationary distributions whose parameters vary with physical covariates: the El Niño-Southern Oscillation index (ENSO index), an indicator of large-scale natural variability; global average temperature, an indicator of human-induced global warming; and local mean temperature, which possibly indicates more localized changes. Each non-stationary model considers one physical covariate, and the best statistical model chosen at each rainfall grid point gives the most significant physical driver for each extreme rainfall characteristic at that grid point. Intensity, duration, and frequency of extreme rainfall exhibit non-stationarity due to different drivers, and no spatially uniform pattern is observed in their changes across the country. At most locations, the duration of extreme rainfall spells is found to be stationary, while non-stationary associations between intensity and frequency and local changes in temperature are detected at a large number of locations. This study presents the first application of non-stationary statistical modeling of the intensity, duration, and frequency of extreme rainfall over India. The developed models are further used for rainfall frequency analysis to show changes in the 100-year extreme rainfall event.
Our findings indicate the varying nature of each extreme rainfall characteristic and its drivers, and emphasize the necessity of a comprehensive framework to assess the resulting risks of precipitation-induced flooding. (C) 2014 Elsevier B.V. All rights reserved.
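The notion of a covariate-dependent 100-year event can be made concrete with a Gumbel annual-maximum model whose location parameter varies linearly with a physical covariate. The paper uses threshold-excess models, so treat this simpler sketch as purely illustrative; all parameter names are assumptions:

```python
import math

def return_level(mu, sigma, T):
    """T-year return level (quantile) of a Gumbel(mu, sigma)
    annual-maximum model: mu - sigma * ln(-ln(1 - 1/T))."""
    return mu - sigma * math.log(-math.log(1.0 - 1.0 / T))

def nonstationary_return_level(mu0, mu1, sigma, covariate, T=100):
    """Let the location parameter drift linearly with a physical covariate
    (e.g. an ENSO index or a temperature anomaly); the scale parameter is
    held fixed for simplicity."""
    return return_level(mu0 + mu1 * covariate, sigma, T)
```

Under such a model the "100-year event" is no longer a single number: it shifts with the covariate, which is exactly why stationarity-based design values can misstate flood risk.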

Relevance: 90.00%

Abstract:

The nonlinear behavior varying with the instantaneous response is analyzed through the joint time-frequency analysis method for a class of single-degree-of-freedom (SDOF) nonlinear systems. A masking operator on definite regions is defined and two theorems are presented. Based on these, the nonlinear system is modeled with a special time-varying linear one, called the generalized skeleton linear system (GSLS). The frequency skeleton curve and the damping skeleton curve are defined to describe the main features of the nonlinearity as well. Moreover, an identification method is proposed through the skeleton curves and the time-frequency filtering technique.

Relevance: 90.00%

Abstract:

In a previous paper, a class of nonlinear systems was mapped to a so-called skeleton linear model (SLM) based on the joint time-frequency analysis method. The behavior of the nonlinear system may be indicated quantitatively by the variation of the SLM coefficients with its response. Using this model, we propose an identification method for nonlinear systems based on nonstationary vibration data. The key technique in the identification procedure is a time-frequency filtering method by which the solution of the SLM is extracted from the response data of the corresponding nonlinear system. Two time-frequency filtering methods are discussed here: one is based on the quadratic time-frequency distribution and its inverse transform, the other on the quadratic time-frequency distribution and the wavelet transform. Both numerical examples and an experimental application are given to illustrate the validity of the technique.
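Time-frequency filtering can be caricatured with a plain frequency-domain mask; the papers use quadratic time-frequency distributions and wavelets, so treat this fixed-band FFT filter purely as a simplified stand-in (names and band edges are assumptions):

```python
import numpy as np

def bandpass(signal, fs, f_lo, f_hi):
    """Crude frequency-domain filter: zero all FFT bins outside
    [f_lo, f_hi] Hz.  A stand-in for the time-frequency filtering used to
    extract the skeleton-linear-model response from nonstationary data."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    return np.fft.irfft(spec, n=len(signal))
```

The real methods differ in that the passband follows the instantaneous frequency of the response over time rather than staying fixed.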

Relevance: 90.00%

Abstract:

Accurate and precise estimates of age and growth rates are essential parameters in understanding the population dynamics of fishes. Some of the more sophisticated stock assessment models, such as virtual population analysis, require age and growth information to partition catch data by age. Stock assessment efforts by regulatory agencies are usually directed at specific fisheries which are being heavily exploited and are suspected of being overfished. Interest in stock assessment of some of the oceanic pelagic fishes (tunas, billfishes, and sharks) has developed only over the last decade, during which exploitation has increased steadily in response to increases in worldwide demand for these resources. Traditionally, estimating the age of fishes has been done by enumerating growth bands on skeletal hardparts, through length frequency analysis, tag-and-recapture studies, and raising fish in enclosures. However, problems related to determining the age of some of the oceanic pelagic fishes are unique compared with other species. For example, sampling is difficult for these large, highly mobile fishes because of their size, their extensive distributions throughout the world's oceans, and, for some, such as the marlins, infrequent catches. In addition, movements of oceanic pelagic fishes often transect temperate as well as tropical oceans, making interpretation of growth bands on skeletal hardparts more difficult than with more sedentary temperate species. Many oceanic pelagics are also long-lived, attaining ages in excess of 30 yr, and more often than not their life cycles do not lend themselves easily to artificial propagation and culture. These factors contribute to the difficulty of determining ages and are generally characteristic of this group: the tunas, billfishes, and sharks. Accordingly, the rapidly growing international concern in managing oceanic pelagic fishes, as well as the unique difficulties in ageing these species, prompted us to hold this workshop.
Our two major objectives for this workshop are to: 1) encourage the interchange of ideas on this subject, and 2) establish the "state of the art." A total of 65 scientists from 10 states in the continental United States and Hawaii, three provinces in Canada, France, the Republic of Senegal, Spain, Mexico, Ivory Coast, and New South Wales (Australia) attended the workshop held at the Southeast Fisheries Center, Miami, Fla., 15-18 February 1982. Our first objective, encouraging the interchange of ideas, is well illustrated in the summaries of the Round Table Discussions and in the Glossary, which defines terms used in this volume. The majority of the workshop participants agreed that the lack of validation of age estimates, and of the means to accomplish it, are serious problems preventing advancements in assessing the age and growth of fishes, particularly oceanic pelagics. The alternatives relating to the validation problem were exhaustively reviewed during the Round Table Discussions and are a major highlight of this workshop. How well we accomplished our second objective, to establish the "state of the art" on age determination of oceanic pelagic fishes, will probably best be judged on the basis of these proceedings and whether future research efforts are directed at the problem areas we have identified. In order to produce high-quality papers, workshop participants served as referees for the manuscripts published in this volume. Several papers given orally at the workshop, and included in these proceedings, were summarized from full-length manuscripts which have been submitted to or published in other scientific outlets; these papers are designated as SUMMARY PAPERS. In addition, the SUMMARY PAPER designation was also assigned to workshop papers that represented very preliminary or initial stages of research, cursory progress reports, papers that were data-shy, or papers that provided only brief reviews of general topics.
Bilingual abstracts were included for all papers that required translation. We gratefully acknowledge the support of everyone involved in this workshop. Funding was provided by the Southeast Fisheries Center, and Jack C. Javech did the scientific illustrations appearing on the cover, between major sections, and in the Glossary. (PDF file contains 228 pages.)

Relevance: 90.00%

Abstract:

EXTRACT (SEE PDF FOR FULL ABSTRACT): After 1960, the Santa Cruz River at Tucson, Arizona, an ephemeral stream normally dominated by summer floods, experienced an apparent increased frequency of flooding coincident with an increased percentage of annual floods occurring in fall and winter. This shift reflects large-scale and low-frequency changes in the eastern Pacific Ocean, in part associated with El Niño-Southern Oscillation (ENSO) phenomena. ... Questions are raised about the validity of standard methods of flood-frequency analysis to estimate regulatory and designed floods.

Relevance: 90.00%

Abstract:

A neural network theory of 3-D vision, called FACADE Theory, is described. The theory proposes a solution to the classical figure-ground problem for biological vision. It does so by suggesting how boundary representations and surface representations are formed within a Boundary Contour System (BCS) and a Feature Contour System (FCS). The BCS and FCS interact reciprocally to form 3-D boundary and surface representations that are mutually consistent. Their interactions generate 3-D percepts wherein occluding and occluded objects are completed and grouped. The theory clarifies how preattentive processes of 3-D perception and figure-ground separation interact reciprocally with attentive processes of spatial localization, object recognition, and visual search. A new theory of stereopsis is proposed that predicts how cells sensitive to multiple spatial frequencies, disparities, and orientations are combined by context-sensitive filtering, competition, and cooperation to form coherent BCS boundary segmentations. Several factors contribute to figure-ground pop-out, including: boundary contrast between spatially contiguous boundaries, whether due to scenic differences in luminance, color, spatial frequency, or disparity; partially ordered interactions from larger spatial scales and disparities to smaller scales and disparities; and surface filling-in restricted to regions surrounded by a connected boundary. Phenomena such as 3-D pop-out from a 2-D picture, DaVinci stereopsis, 3-D neon color spreading, completion of partially occluded objects, and figure-ground reversals are analysed. The BCS and FCS subsystems model aspects of how the two parvocellular cortical processing streams that join the Lateral Geniculate Nucleus to prestriate cortical area V4 interact to generate a multiplexed representation of Form-And-Color-And-Depth, or FACADE, within area V4. Area V4 is suggested to support figure-ground separation and to interact with cortical mechanisms of spatial attention, attentive object learning, and visual search. Adaptive Resonance Theory (ART) mechanisms model aspects of how prestriate visual cortex interacts reciprocally with a visual object recognition system in inferotemporal cortex (IT) for purposes of attentive object learning and categorization. Object attention mechanisms of the What cortical processing stream through IT cortex are distinguished from spatial attention mechanisms of the Where cortical processing stream through parietal cortex. Parvocellular BCS and FCS signals interact with the model What stream. Parvocellular FCS and magnocellular Motion BCS signals interact with the model Where stream. Reciprocal interactions between these visual, What, and Where mechanisms are used to discuss data about visual search and saccadic eye movements, including fast search of conjunctive targets, search of 3-D surfaces, selective search of like-colored targets, attentive tracking of multi-element groupings, and recursive search of simultaneously presented targets.

Relevance: 90.00%

Abstract:

Phytoplankton observation is the product of a number of trade-offs related to sampling processes, the required level of diversity, and the size-spectrum analysis capabilities of the techniques involved. Instruments combining morphological and high-frequency analysis of phytoplankton cells are now available. This paper presents an application of the automated high-resolution flow cytometer Cytosub as a tool for analysing phytoplankton cells in their natural environment. High-resolution data from a temporal study in the Bay of Marseille (analysis every 30 min over 1 month) and a spatial study in the Southern Indian Ocean (analysis every 5 min at 10 knots over 5 days) are presented to illustrate the capabilities and limitations of the instrument. Automated high-frequency flow cytometry revealed spatial and temporal variability of phytoplankton in the size range 1 to ~50 μm that could not be resolved otherwise. Due to some limitations (instrument memory, volume analysed per sample), recorded counts can be statistically too low. By combining consecutive high-frequency samples, it is possible to decrease the counting error, following Poisson's law, while retaining the main features of phytoplankton variability. With this technique, the analysis of phytoplankton variability combines adequate sampling frequency with effective monitoring of community changes.
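The Poisson argument for pooling consecutive samples is easy to make concrete: for a count of N cells, the relative counting error is 1/sqrt(N), so summing k samples with similar counts shrinks the error by about sqrt(k). A minimal sketch (the function name is an assumption):

```python
import math

def combine_counts(counts):
    """Pool consecutive flow-cytometry cell counts.  Under Poisson
    statistics the absolute error is sqrt(N), so the relative error of the
    pooled count N is 1/sqrt(N) and shrinks as counts accumulate."""
    total = sum(counts)
    rel_err = 1.0 / math.sqrt(total) if total > 0 else float("inf")
    return total, rel_err
```

Pooling trades time resolution for statistical precision, which is the balance the abstract describes between sampling frequency and reliable community monitoring.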