917 results for Analysis of Algorithms and Problem Complexity


Relevance: 100.00%

Publisher:

Abstract:

In this paper we present algorithms that operate on pairs of 0,1-matrices whose product is again a matrix of zero and one entries. When applied to such a pair, the algorithms change the number of non-zero entries in the matrices while leaving their product unchanged. We establish the conditions under which the number of 1s decreases. We also recursively define pairs of matrices whose product is a specific matrix and on which these algorithms minimize the total number of non-zero entries across both matrices. These matrices may be interpreted as solutions to a well-known information retrieval problem, in which case the number of 1-entries represents the complexity of the retrieval and information update operations.
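For intuition, here is a minimal brute-force sketch (not the paper's algorithms, which transform a given pair in place): it searches exhaustively for a pair of 0,1-matrices whose ordinary product equals a given 0,1-matrix C while minimizing the total number of 1-entries. The function name, the toy matrix and the inner dimension are illustrative choices.

```python
# Brute-force illustration (exponential; tiny sizes only): find 0/1 matrices
# A (m x r) and B (r x n) with A @ B == C and the fewest total 1-entries.
from itertools import product

import numpy as np

def min_ones_factorization(C, inner_dim):
    m, n = C.shape
    best = None
    for a_bits in product([0, 1], repeat=m * inner_dim):
        A = np.array(a_bits).reshape(m, inner_dim)
        for b_bits in product([0, 1], repeat=inner_dim * n):
            B = np.array(b_bits).reshape(inner_dim, n)
            if np.array_equal(A @ B, C):          # product must equal C exactly
                cost = int(A.sum() + B.sum())     # total 1-entries in the pair
                if best is None or cost < best[0]:
                    best = (cost, A, B)
    return best

C = np.array([[1, 1], [1, 0]])
cost, A, B = min_ones_factorization(C, inner_dim=2)
print("minimum total 1s:", cost)   # 5 for this C
```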

Relevance: 100.00%

Publisher:

Abstract:

At the level of problem statement, the paper considers the introduction of the concept of a "development space", the kinds of possible changes of a system, and the structure and mechanisms of development. Typologies of development indicators, the role of the information component and the notion of quality are also considered.

Relevance: 100.00%

Publisher:

Abstract:

A case study of an aircraft engine manufacturer is used to analyze the effects of management levers on the lead time and design errors generated in an iteration-intensive concurrent engineering process. The levers considered are the amount of design-space exploration iteration, the degree of process concurrency, and the timing of design reviews. Simulation is used to show how the ideal combination of these levers can vary with changes in design problem complexity, which can increase, for instance, when novel technology is incorporated in a design. Results confirm that it is important to consider multiple iteration-influencing factors and their interdependencies to understand concurrent processes, because the factors can interact with confounding effects. The article also demonstrates a new approach to deriving a system dynamics model from a process task network, which could be applied to analyze other concurrent engineering scenarios. © The Author(s) 2012.
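The qualitative effect the abstract describes can be reproduced with a deliberately tiny Monte Carlo sketch (an illustration of the idea, not the article's system dynamics model; the two-task setup, rework penalty and probabilities are invented for the example): when rework is unlikely, more overlap shortens lead time, and when design problem complexity makes rework likely, the ideal overlap shrinks.

```python
# Toy model: task 2 starts after (1 - overlap) * base; work done before
# task 1 finishes must be redone, at a penalty, with probability `complexity`.
import random

random.seed(1)

def lead_time(overlap, complexity, base=10.0, penalty=1.5, trials=20000):
    total = 0.0
    for _ in range(trials):
        start2 = (1.0 - overlap) * base
        concurrent_work = base - start2          # portion done on preliminary info
        rework = concurrent_work * penalty if random.random() < complexity else 0.0
        total += start2 + base + rework          # finish time of task 2
    return total / trials

for c in (0.2, 0.8):
    best = min((lead_time(o, c), o) for o in (0.0, 0.25, 0.5, 0.75, 1.0))
    print(f"complexity={c}: best overlap={best[1]}, expected lead time={best[0]:.1f}")
```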

Relevance: 100.00%

Publisher:

Abstract:

Comorbidity between problem gambling and internalising disorders (anxiety and depression) has long been recognised. However, it is not clear how these relationships develop, and what factors can foster resilience to both conditions. The current study draws on longitudinal cohort data to investigate: 1) the cross-sectional and longitudinal relationships between problem gambling and internalising symptoms; 2) whether there are common and/or specific social environmental factors protective against both internalising symptoms and problem gambling in young adulthood; and 3) interactive protective factors (i.e., those that moderate the relationship between problem gambling and internalising symptoms).

Relevance: 100.00%

Publisher:

Abstract:

Background: Discussions of gambling have traditionally focused on ideas of “problem” and “responsible” gambling. However, few studies have examined how institutions attempt to exert social control over gamblers in order to promote so-called “responsible” behaviour. In this study, we examine the way “problem” and “responsible” gambling are discussed by Australian governments and the gambling industry, using a theoretical framework based on the work of Foucault.

Method: We conducted a thematic analysis of discourses surrounding problem and responsible gambling in government and gambling industry websites, television campaigns and responsible gambling materials.

Results: Documents distinguished between gambling, which was positive for the community, and problem gambling, which was portrayed as harmful and requiring medical intervention. The need for responsible gambling was emphasised in many of the documents, and reinforced by mechanisms including self-monitoring, self-control and surveillance of gamblers.

Conclusions: Government and industry expect gamblers to behave “responsibly”, and are heavily influenced by neoliberal ideas of rational, controlled subjects in their conceptualisation of what constitutes “responsible behaviour”. As a consequence, problem gamblers are constructed as a deviant group. This may have significant consequences for problem gamblers, such as the creation of stigma.

Relevance: 100.00%

Publisher:

Abstract:

Many combinatorial problems arising in the real world lack a clear, well-defined structure: they are typically complicated by side constraints, or composed of two or more sub-problems that are usually not disjoint. Such problems are not well suited to pure approaches based on a single programming paradigm, because a paradigm that handles one characteristic of a problem effectively may behave inefficiently when facing others. In these cases, modelling the problem using different programming techniques, trying to "take the best" from each, can produce solvers that largely dominate pure approaches. We demonstrate the effectiveness of hybridization and discuss different hybridization techniques by analyzing two classes of problems with particular structures, exploiting Constraint Programming and Integer Linear Programming solving tools, with Algorithm Portfolios and Logic-Based Benders Decomposition as integration and hybridization frameworks.
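As a hedged illustration of one of the named frameworks, the sketch below runs a logic-based Benders decomposition loop on an invented toy assignment problem. In the thesis the master and subproblem would be handled by ILP and CP solvers; here both stages are brute-forced so the example stays dependency-free, and all data are made up.

```python
# Logic-based Benders decomposition, toy version: a master problem picks the
# cheapest assignment subject to accumulated cuts; a feasibility subproblem
# checks machine capacity and returns a no-good cut when violated.
from itertools import product

DURATIONS = [4, 3, 3, 2]                  # processing time of each item
COST = [[2, 5], [4, 1], [3, 3], [1, 4]]   # COST[i][m]: item i on machine m
CAPACITY = 6                              # per-machine capacity

def master(cuts):
    """Enumerate all assignments; return the cheapest not forbidden by cuts."""
    best = None
    for assign in product(range(2), repeat=len(DURATIONS)):
        if any(all(assign[i] == m for i in items) for items, m in cuts):
            continue                      # violates a Benders (no-good) cut
        cost = sum(COST[i][assign[i]] for i in range(len(assign)))
        if best is None or cost < best[0]:
            best = (cost, assign)
    return best

def subproblem(assign):
    """Capacity check; return a cut (items, machine) if a machine is overloaded."""
    for m in range(2):
        items = tuple(i for i in range(len(assign)) if assign[i] == m)
        if sum(DURATIONS[i] for i in items) > CAPACITY:
            return (items, m)             # never put all of `items` on m again
    return None

cuts = []
while True:
    cost, assign = master(cuts)
    cut = subproblem(assign)
    if cut is None:
        print("optimal assignment:", assign, "cost:", cost)
        break
    cuts.append(cut)
```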

Relevance: 100.00%

Publisher:

Abstract:

The compelling quality of the Global Change simulation study (Altemeyer, 2003), in which high RWA (right-wing authoritarianism)/high SDO (social dominance orientation) individuals produced poor outcomes for the planet, rests on the inference that the link between high RWA/SDO scores and disaster in the simulation can be generalized to real environmental and social situations. However, we argue that studies of the Person × Situation interaction are biased to overestimate the role of individual variability. When variables are operationalized, strongly normative items are excluded because they are skewed and kurtotic. This occurs both in the measurement of predictor constructs, such as RWA, and in outcome constructs, such as prejudice and war. Analyses based on normal linear statistics highlight personality variables such as RWA, which produce variance, and overlook the role of norms, which produce invariance. Where both normative and personality forces are operating, as in intergroup contexts, the linear analysis generates statistics for the sample that disproportionately reflect the behavior of the deviant, antinormative minority and direct attention away from the baseline, normative position. The implications of these findings for the link between high RWA and disaster are discussed.

Relevance: 100.00%

Publisher:

Abstract:

The theory of nonlinear dynamic systems provides new methods for handling complex systems. Chaos theory offers concepts, algorithms and methods for processing, enhancing and analyzing measured signals, and in recent years researchers have been applying these concepts to bio-signal analysis. In this work, the complex dynamics of bio-signals such as the electrocardiogram (ECG) and the electroencephalogram (EEG) are analyzed using the tools of nonlinear systems theory.

In the modern industrialized countries, several hundred thousand people die every year of sudden cardiac death. The electrocardiogram (ECG) is an important bio-signal representing the sum total of millions of cardiac cell depolarization potentials, and it offers important insight into the state of health and the nature of the disease afflicting the heart. Heart rate variability (HRV) refers to the regulation of the sinoatrial node, the natural pacemaker of the heart, by the sympathetic and parasympathetic branches of the autonomic nervous system. HRV analysis is an important tool for observing the heart's ability to respond to the normal regulatory impulses that affect its rhythm, and a computer-based intelligent system for the analysis of cardiac states is very useful in diagnostics and disease management. Like many bio-signals, HRV signals are non-linear in nature. Higher order spectral analysis (HOS) is known to be a good tool for the analysis of non-linear systems and provides good noise immunity. In this work, we studied the HOS of the HRV signals of normal heartbeat and four classes of arrhythmia. This thesis presents some general characteristics for each of these classes of HRV signals in the bispectrum and bicoherence plots. Several features were extracted from the HOS and subjected to an Analysis of Variance (ANOVA) test. The results are very promising for cardiac arrhythmia classification, with a number of features yielding a p-value < 0.02 in the ANOVA test. An automated intelligent system for the identification of cardiac health is very useful in healthcare technology. In this work, seven features were extracted from the heart rate signals using HOS and fed to a support vector machine (SVM) for classification. The performance evaluation protocol in this thesis uses 330 subjects covering five different kinds of cardiac disease conditions. The classifier achieved a sensitivity of 90% and a specificity of 89%, and the system is ready to run on larger data sets.

In EEG analysis, the search for hidden information for the identification of seizures has a long history. Epilepsy is a pathological condition characterized by the spontaneous and unforeseeable occurrence of seizures, during which the perception or behavior of patients is disturbed. Automatic early detection of seizure onsets would help patients and observers to take appropriate precautions, and various methods have been proposed to predict the onset of seizures from EEG recordings. The use of nonlinear features motivated by higher order spectra (HOS) has been reported to be a promising approach for differentiating between normal, background (pre-ictal) and epileptic EEG signals. In this work, these features are used to train both a Gaussian mixture model (GMM) classifier and a support vector machine (SVM) classifier. The classifiers achieved 93.11% and 92.67% classification accuracy, respectively, with selected HOS-based features. About 2 hours of EEG recordings from 10 patients were used in this study.

This thesis introduces unique bispectrum and bicoherence plots for the various cardiac conditions and for normal, background and epileptic EEG signals. These plots reveal distinct patterns that are useful for visual interpretation by readers, such as medical practitioners, who do not have a deep understanding of spectral analysis. The thesis includes original contributions in extracting features from HRV and EEG signals using HOS and entropy measures, in analyzing the statistical properties of these features on real data, and in automated classification using the features with GMM and SVM classifiers.
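A hedged sketch of this kind of pipeline is shown below, on synthetic signals rather than real HRV or EEG data: bispectrum magnitudes are estimated with the direct FFT method, reduced to a few summary features, and fed to an SVM. The feature set, the signal model and all parameters are invented for the illustration; real HRV work would add RR-interval extraction, segmentation and careful bispectrum averaging.

```python
# Toy HOS + SVM pipeline: single-segment bispectrum estimate
# B(f1, f2) = X(f1) X(f2) conj(X(f1 + f2)), summarized into 3 features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def bispectrum_features(x, nfft=64):
    X = np.fft.fft(x, nfft)
    f = np.arange(nfft // 4)                 # coarse principal region
    B = X[f][:, None] * X[f][None, :] * np.conj(X[f[:, None] + f[None, :]])
    mag = np.abs(B)
    return [mag.mean(), mag.max(), np.log1p(mag).sum()]

rng = np.random.default_rng(0)
features, labels = [], []
t = np.arange(256)
for k in range(200):
    label = k % 2
    x = np.sin(0.2 * t) + np.sin(0.31 * t) + 0.5 * rng.standard_normal(t.size)
    if label:                                # class 1: extra component at the
        x += 0.8 * np.sin(0.51 * t)          # sum frequency raises |B| peaks
    features.append(bispectrum_features(x))
    labels.append(label)

clf = SVC(kernel="rbf", gamma="scale")
print("5-fold CV accuracy:",
      cross_val_score(clf, np.array(features), labels, cv=5).mean())
```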

Relevance: 100.00%

Publisher:

Abstract:

This work is conducted in order to answer the practically important question of whether the down conductors of lightning protection systems on tall towers and buildings can be electrically isolated from the structure itself. As a first step, it is presumed that a down conductor placed on a metallic tower is a pessimistic representation of the actual problem, since the proximity of the heavy metallic structure has a large damping effect. The post-stroke current distributions along the down conductors and towers, which can be quite different from that in the lightning channel, govern the post-stroke near field and the resulting gradient in the soil. Also, for a reliable estimation of the actual stroke current from the measured down conductor currents, it is essential to know the current distribution characteristics along the down conductors. In view of these, the present work attempts to deduce the post-stroke current and voltage distribution along typical down conductors and towers. A solution of the governing field equations on an electromagnetic model of the system is sought for the investigation. Simulations of the spatio-temporal distribution of the post-stroke current and voltage have provided very interesting results. It is concluded that it is almost impossible to achieve electrical isolation between the structure and the down conductor. Furthermore, there will be significant induction into the steel matrix of the supporting structure.

Relevance: 100.00%

Publisher:

Abstract:

In this paper, we outline a systematic procedure for scaling analysis of momentum and heat transfer in laser-melted pools. With suitable choices of non-dimensionalising parameters, the governing equations coupled with appropriate boundary conditions are first scaled, and the relative significance of the various terms appearing in them is accordingly analysed. The analysis is then utilised to predict the orders of magnitude of some important quantities, such as the velocity scale at the top surface, the velocity boundary layer thickness, the maximum temperature rise in the pool, the fully developed pool depth, and the time required for initiation of melting. Using the scaling predictions, the influence of various processing parameters on the system variables can be well recognised, which enables us to develop a deeper insight into the physical problem of interest. Moreover, some of the quantities predicted from the scaling analysis can be utilised for optimised selection of appropriate grid sizes and time steps for full numerical simulation of the process. The scaling predictions are finally assessed by comparison with experimental and numerical results quoted in the literature, and an excellent qualitative agreement is observed.
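As an illustration of the kind of balance such an analysis performs (a minimal sketch with assumed notation, not taken from the paper: U_s is the surface velocity scale, sigma_T the magnitude of the surface-tension temperature coefficient, Delta T the characteristic temperature rise, delta the velocity boundary layer thickness, L the pool length scale, and mu, rho, nu the dynamic viscosity, density and kinematic viscosity), equate the Marangoni shear at the free surface with the viscous shear across the boundary layer:

```latex
% Illustrative surface-shear balance (assumed notation, not the paper's):
% Marangoni stress ~ viscous stress across the boundary layer of thickness delta.
\mu \frac{U_s}{\delta} \sim \sigma_T \frac{\Delta T}{L},
\qquad
\delta \sim L\,\mathrm{Re}^{-1/2} = L \left( \frac{\nu}{U_s L} \right)^{1/2}
\quad\Longrightarrow\quad
U_s \sim \left( \frac{\sigma_T\,\Delta T}{\rho} \right)^{2/3} (\nu L)^{-1/3}.
```

The resulting scale is dimensionally consistent (m/s) and shows how the surface velocity estimate responds to material properties and pool size before any full simulation is run.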

Relevance: 100.00%

Publisher:

Abstract:

In this paper, we analyze the coexistence of a primary and a secondary (cognitive) network when both networks use the IEEE 802.11 based distributed coordination function for medium access control. Specifically, we consider the problem of channel capture by a secondary network that uses spectrum sensing to determine the availability of the channel, and its impact on the primary throughput. We integrate the notion of transmission slots in Bianchi's Markov model with the physical time slots to derive the transmission probability of the secondary network as a function of its scan duration. This is used to obtain analytical expressions for the throughput achievable by the primary and secondary networks. Our analysis considers both saturated and unsaturated networks. By performing a numerical search, the secondary network parameters are selected to maximize its throughput for a given level of protection of the primary network throughput. The theoretical expressions are validated using extensive simulations carried out in the Network Simulator 2. Our results provide critical insights into the performance and robustness of different schemes for medium access by the secondary network. In particular, we find that channel capture by the secondary network does not significantly impact the primary throughput, and that simply increasing the secondary contention window size is only marginally inferior to silent-period based methods in terms of its throughput performance.
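The Bianchi fixed point that underlies such an analysis can be sketched compactly. This is the standard saturated single-network model, not the paper's extension with scan durations and physical slots; W, m and the damping factor are illustrative choices.

```python
# Standard Bianchi fixed point: tau is the per-slot transmission probability,
# p the conditional collision probability, W the minimum contention window,
# m the number of backoff stages. Solved by damped fixed-point iteration.
def bianchi_tau(n, W=32, m=5, iters=200):
    tau = 0.1
    for _ in range(iters):
        p = 1.0 - (1.0 - tau) ** (n - 1)          # collision probability
        tau_new = (2.0 * (1.0 - 2.0 * p)
                   / ((1.0 - 2.0 * p) * (W + 1)
                      + p * W * (1.0 - (2.0 * p) ** m)))
        tau = 0.5 * tau + 0.5 * tau_new           # damping aids convergence
    return tau, p

for n in (5, 10, 20):
    tau, p = bianchi_tau(n)
    print(f"n={n:2d}: tau={tau:.4f}, collision prob={p:.4f}")
```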

Relevance: 100.00%

Publisher:

Abstract:

The trapezoidal rule, which is a special case of the Newmark family of algorithms, is one of the most widely used methods for transient hyperbolic problems. In this work, we show that this rule conserves linear and angular momenta and energy in the case of undamped linear elastodynamics problems, and an “energy-like measure” in the case of undamped acoustic problems. These conservation properties thus provide a rational basis for using this algorithm. In linear elastodynamics problems, variants of the trapezoidal rule that incorporate “high-frequency” dissipation are often used, since the higher frequencies, which are not approximated properly by the standard displacement-based approach, often result in unphysical behavior. Instead of modifying the trapezoidal algorithm, we propose using a hybrid finite element framework for constructing the stiffness matrix. Hybrid finite elements, which are based on a two-field variational formulation involving displacement and stresses, are known to approximate the eigenvalues much more accurately than the standard displacement-based approach, thereby either bypassing or reducing the need for high-frequency dissipation. We show this by means of several examples, where we compare the numerical solutions obtained using the displacement-based and hybrid approaches against analytical solutions.
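The conservation claim for undamped linear elastodynamics is easy to check numerically on a single degree of freedom. The sketch below (an illustration with arbitrary M, K and step size, not from the paper) integrates M a + K u = 0 with the trapezoidal rule, i.e. Newmark with beta = 1/4 and gamma = 1/2, and reports the energy drift, which stays at machine precision.

```python
# Trapezoidal (average acceleration) rule on an undamped oscillator:
# u_{n+1} = u_n + dt*v_n + dt^2/4*(a_n + a_{n+1}),
# v_{n+1} = v_n + dt/2*(a_n + a_{n+1}),  a = -(K/M) u.
M, K = 1.0, 4.0
dt, steps = 0.1, 1000
u, v = 1.0, 0.0                     # initial displacement and velocity
a = -K / M * u

energy = lambda u, v: 0.5 * M * v**2 + 0.5 * K * u**2
E0 = energy(u, v)
for _ in range(steps):
    # solve the implicit update for a_{n+1}
    rhs = u + dt * v + 0.25 * dt**2 * a
    a_new = -(K / M) * rhs / (1.0 + 0.25 * dt**2 * K / M)
    u = u + dt * v + 0.25 * dt**2 * (a + a_new)
    v = v + 0.5 * dt * (a + a_new)
    a = a_new
print(f"relative energy drift after {steps} steps: "
      f"{abs(energy(u, v) - E0) / E0:.2e}")
```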

Relevance: 100.00%

Publisher:

Abstract:

Two stages have been observed in micro-indentation experiments on a soft film on a hard substrate. In the first stage, when indentation is shallow, the hardness of the thin film decreases with increasing indentation depth; in the second stage, when the indenter tip approaches the hard substrate, the hardness increases with increasing depth. In this paper, the new strain gradient theory is used to analyze the micro-indentation behavior of a soft film on a hard substrate, and classical plasticity theory is also applied to the problem. Comparing the two theoretical results with the experimental data shows that the strain gradient theory describes the data quite well at both shallow and deep indentation depths, while the classical theory cannot explain the experimental results.
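For a feel of the first stage only, the classic Nix-Gao strain-gradient relation H(h) = H0 * sqrt(1 + h*/h) reproduces hardness falling with increasing depth. It is quoted here as a well-known strain-gradient result, not as the specific theory used in the paper, and the values of H0 and h* below are invented; it does not capture the second, substrate-induced rise.

```python
# Indentation size effect via the classic Nix-Gao relation (illustrative
# parameters only; captures stage one, not the hard-substrate stage two).
import math

H0, h_star = 1.5, 0.4        # GPa, micrometres (assumed values)
for h in (0.1, 0.2, 0.5, 1.0, 2.0):
    H = H0 * math.sqrt(1.0 + h_star / h)
    print(f"depth {h:.1f} um: hardness {H:.2f} GPa")
```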