959 results for Complexity analysis


Relevance: 30.00%

Abstract:

Methods for the calculation of complexity have been investigated as a possible alternative for the analysis of the dynamics of molecular systems. “Computational mechanics” is the approach chosen to describe emergent behavior in molecular systems that evolve in time. A novel algorithm has been developed for symbolization of a continuous physical trajectory of a dynamic system. A method for calculating statistical complexity has been implemented and tested on representative systems. It is shown that the computational mechanics approach is suitable for analyzing the dynamic complexity of molecular systems and offers new insight into the process.
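
The abstract does not reproduce the symbolization algorithm or the complexity estimator; the following is a minimal illustrative sketch in Python, where all names, the quantile-binning scheme and the crude state-merging rule are assumptions rather than the paper's method. A continuous trajectory is discretized into symbols, histories with similar next-symbol statistics are merged into approximate causal states, and statistical complexity is taken as the Shannon entropy of the state distribution.

```python
import numpy as np
from collections import defaultdict

def symbolize(x, n_bins=4):
    """Discretize a continuous trajectory by quantile binning
    (a crude stand-in for the paper's symbolization algorithm)."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(x, edges)

def statistical_complexity(symbols, n_bins=4, history_len=3):
    """Shannon entropy of approximate causal states: histories of length
    `history_len` are merged when their next-symbol distributions agree
    after coarse rounding (a naive equivalence test, not a true e-machine)."""
    futures = defaultdict(lambda: defaultdict(int))
    for i in range(history_len, len(symbols)):
        past = tuple(symbols[i - history_len:i])
        futures[past][symbols[i]] += 1
    states = defaultdict(float)  # merged state -> probability mass
    total = sum(sum(c.values()) for c in futures.values())
    for counts in futures.values():
        n = sum(counts.values())
        key = tuple(round(counts.get(s, 0) / n, 1) for s in range(n_bins))
        states[key] += n / total
    p = np.array(list(states.values()))
    return float(-np.sum(p * np.log2(p)))

# Example on a synthetic noisy trajectory
rng = np.random.default_rng(0)
trajectory = np.cumsum(rng.normal(size=10_000)) * 0.01 + rng.normal(size=10_000)
print(statistical_complexity(symbolize(trajectory)))
```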

Relevance: 30.00%

Abstract:

The computational mechanics approach has been applied to the orientational behavior of water molecules in a molecular dynamics simulated water–Na+ system. The distinctly different statistical complexity of water molecules in the bulk and in the first solvation shell of the ion is demonstrated. It is shown that the molecules undergo more complex orientational motion when surrounded by other water molecules than when constrained by the electric field of the ion. However, the spatial coordinates of the oxygen atom show the opposite behavior: complexity is higher for the solvation-shell molecules. This provides new information about the dynamics of water molecules in the solvation shell, additional to that given by traditional methods of analysis.

Relevance: 30.00%

Abstract:

This paper seeks to advance the theory and practice of the dynamics of complex networks in relation to direct and indirect citations. It applies social network analysis (SNA) and the ordered weighted averaging (OWA) operator to study a patent citation network. To date, SNA studies investigating long chains of patent citations have rarely been undertaken, and the importance of a node in a network has been associated mostly with its number of direct ties. In this research, OWA is used to analyse complex networks, assess the role of indirect ties, and provide guidance for reducing complexity for decision makers and analysts. An empirical example of a set of European patents published in 2000 in the renewable energy industry shows the usefulness of the proposed approach for the preference ranking of patent citations.
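
For readers unfamiliar with the operator, a minimal sketch of OWA follows; the weight vector and the use of citation counts at different path lengths are illustrative assumptions, not the paper's parameterization. The defining property is that weights attach to ranks of the sorted arguments, not to particular sources.

```python
import numpy as np

def owa(values, weights):
    """Ordered weighted averaging: weights apply to values by rank,
    not by source (the largest value gets weights[0], and so on)."""
    values = np.asarray(values, dtype=float)
    weights = np.asarray(weights, dtype=float)
    assert values.shape == weights.shape and np.isclose(weights.sum(), 1.0)
    return float(np.sort(values)[::-1] @ weights)

# Hypothetical patent node: citation counts at path lengths 1, 2 and 3.
direct, second, third = 12, 30, 7
print(owa([direct, second, third], [0.5, 0.3, 0.2]))  # 30*0.5 + 12*0.3 + 7*0.2 = 20.0
```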

Relevance: 30.00%

Abstract:

Systems analysis (SA) is widely used to solve complex and vaguely defined problems. The initial stages of SA decompose problems and purposes into sub-problems and sub-purposes of lower complexity and vagueness, which are combined into hierarchical structures of problems (SP) and purposes (PS). Managers must be sure that the PS, and the purpose-realizing system (PRS) that can achieve the PS purposes, are adequate to the problem being solved. However, SP/PS are usually not well substantiated, because their development rests on collective expertise that relies on natural-language logic and expert-estimation methods. For this reason the scientific foundations of SA cannot be considered fully formed. The structure-and-purpose approach to SA, based on logic-and-linguistic simulation of problem/purpose analysis, is a step towards formalizing the initial stages of SA, improving the adequacy of their results and raising the quality of SA as a whole. Managers of industrial organizational systems using this approach can eliminate logical errors in SP/PS at early stages of planning and thus find better solutions to complex and vague problems.

Relevance: 30.00%

Abstract:

Location estimation is important for wireless sensor network (WSN) applications. In this paper we propose a Cramer-Rao bound (CRB) based analytical approach for two centralized multi-hop localization algorithms, to gain insight into the error performance and its sensitivity to the distance measurement error, anchor node density and anchor placement. The location estimation performance is compared by simulation with that of four distributed multi-hop localization algorithms to evaluate the efficiency of the proposed analytical approach. The numerical results demonstrate the complex trade-off between centralized and distributed localization algorithms in accuracy, complexity and communication overhead. Based on this analysis, an efficient and scalable performance evaluation tool can be designed for localization algorithms in large-scale WSNs, where simulation-based evaluation approaches are impractical. © 2013 IEEE.
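
As a rough illustration of the bound involved (a single-hop, 2-D simplification with Gaussian range noise; the paper's multi-hop derivation is more involved), the sketch below accumulates the Fisher information contributed by each anchor and reports the trace of its inverse as a lower bound on the position mean-squared error.

```python
import numpy as np

def crb_position(node, anchors, sigma=1.0):
    """Cramer-Rao bound (trace of the inverse FIM) for 2-D range-based
    localization with i.i.d. Gaussian range noise of std `sigma`.
    Single-hop case only, as a simplified stand-in for the paper's analysis."""
    node = np.asarray(node, dtype=float)
    fim = np.zeros((2, 2))
    for a in np.asarray(anchors, dtype=float):
        d = node - a
        u = d / np.linalg.norm(d)            # unit vector anchor -> node
        fim += np.outer(u, u) / sigma**2     # each range adds rank-1 information
    return float(np.trace(np.linalg.inv(fim)))  # lower bound on position MSE

anchors = [(0, 0), (10, 0), (0, 10), (10, 10)]  # hypothetical anchor layout
print(crb_position((4, 6), anchors, sigma=0.5))
```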

Relevance: 30.00%

Abstract:

Background: Failing a high-stakes assessment at medical school is a major event for those who go through the experience. Students who fail at medical school may be more likely to struggle in professional practice, so helping individuals overcome problems and respond appropriately is important. Little is understood about what factors influence how individuals experience failure or make sense of the failing experience in remediation. The aim of this study was to investigate the complexity surrounding the failure experience from the student's perspective using interpretative phenomenological analysis (IPA).
Methods: The accounts of three medical students who had failed final re-sit exams were subjected to in-depth analysis using IPA methodology. IPA was used to analyse each transcript case by case, allowing the researcher to make sense of the participant's subjective world. The analysis process allowed the complexity surrounding the failure to be highlighted, alongside a narrative describing how students made sense of the experience.
Results: The circumstances surrounding students as they approached assessment and experienced failure at finals were a complex interaction between academic problems, personal problems (specifically finance and relationships), strained relationships with friends, family or faculty, and various mental health problems. Each student faced multi-dimensional issues in an individual combination of problems, yet experienced remediation as a one-dimensional intervention focused only on improving performance in written exams. What these students needed was remediation that also included help with clinical skills, plus social and emotional support. Fear of termination of their course was a barrier to open communication with staff.
Conclusions: These students' experience of failure was complex. The experience of remediation is influenced by the way in which students make sense of failing. Generic remediation programmes may fail to meet the needs of students for whom personal, social and mental health issues are part of the picture.

Relevance: 30.00%

Abstract:

The term oxylipin refers to oxygenated products of polyunsaturated fatty acids that can arise through either non-enzymatic or enzymatic processes, generating a complex array of products including alcohols, aldehydes, ketones, acids and hydrocarbon gases. Study of the biosynthetic origin of these products has revealed an array of enzymes involved in their formation and, more recently, a radical pathway. These range from the lipoxygenases and α-dioxygenase that insert both oxygen atoms into the acyl chain to initiate the pathways, to the specialised P450 monooxygenases responsible for downstream processing. The latter group includes enzymes at the branch points, such as allene oxide synthase, leading to jasmonate signalling; hydroperoxide lyase, responsible for generating pathogen/pest defensive volatiles; and the divinyl ether synthases and peroxygenases involved in the formation of antimicrobial compounds. The complexity of the products generated raises significant challenges for their rapid identification and quantification using metabolic screening methods. Here, current developments in oxylipin metabolism are reviewed together with the emerging technologies required to expand this important field of research, which underpins advances in plant-pest/pathogen interactions.

Relevance: 30.00%

Abstract:

Online enquiry communities such as Question Answering (Q&A) websites allow people to seek answers to all kinds of questions. With the growing popularity of such platforms, it is important for community managers to constantly monitor the performance of their communities. Although different metrics have been proposed for tracking the evolution of such communities, maturity, the process by which communities become more topic-proficient over time, has been largely ignored despite its potential to help in identifying robust communities. In this paper, we interpret community maturity as the proportion of complex questions in a community at a given time. We use the Server Fault (SF) community, a Q&A community of system administrators, as our case study and analyse question complexity, the level of expertise required to answer a question. We show that question complexity depends on both the length of involvement and the level of contributions of the users who post questions within their community. We extract features relating to askers, answerers, questions and answers, and analyse which features are strongly correlated with question complexity. Although our findings highlight the difficulty of automatically identifying question complexity, we found that complexity is influenced most by the topical focus and the length of community involvement of askers. Following the identification of question complexity, we define a measure of maturity and analyse the evolution of different topical communities. Our results show that different topical communities exhibit different maturity patterns: some show high maturity from the beginning, while others mature slowly. Copyright 2013 ACM.
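
A minimal sketch of the maturity measure as interpreted above, the proportion of complex questions in a time window, follows. The data layout and month-level windowing are assumptions for illustration; in the paper, complexity labels come from the feature analysis rather than being given.

```python
from datetime import date

# Hypothetical question log: (posting date, is_complex flag).
questions = [
    (date(2012, 1, 5), False), (date(2012, 1, 20), True),
    (date(2012, 2, 3), True),  (date(2012, 2, 28), True),
]

def maturity(questions, year, month):
    """Community maturity for one month: share of complex questions."""
    window = [c for d, c in questions if (d.year, d.month) == (year, month)]
    return sum(window) / len(window) if window else None

print(maturity(questions, 2012, 1))  # 0.5
print(maturity(questions, 2012, 2))  # 1.0
```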

Relevance: 30.00%

Abstract:

Purpose – The purpose of this paper is to identify the commonalities and differences in manufacturers' motivations to servitise.
Design/methodology/approach – UK study based on interviews with 40 managers in 25 companies in 12 sectors. Using the concept of product complexity, sectors were grouped using the Complex Products and Systems (CoPS) typology: non-complex products, complex products and systems.
Findings – Motivations to servitise were categorised as competitive, demand based (i.e. derived from the customer) or economic. Motivations to servitise vary according to product complexity, although cost savings and improved service quality appear important demand-based motivations for all manufacturers. Non-complex product manufacturers also focus on services to help product differentiation. For CoPS manufacturers, both risk reduction and developing a new revenue stream were important motivations. Uniquely for complex product manufacturers, stabilising revenue and increased profitability were strong motivations. Uniquely for systems manufacturers, customers sought business transformation, whilst new service business models were also identified.
Research limitations/implications – Using the CoPS typology, this study delineates motivations to servitise by sector. The findings show varying motivations to servitise as product complexity increases, although some motivational commonality existed across all groups. Manufacturers may have products of differing complexity within their portfolio; to overcome this limitation, the unit of analysis was the strategic business unit.
Practical implications – Managers can reflect on and benchmark their motivation for, and opportunities from, servitisation by considering product complexity.
Originality/value – The first study to categorise servitisation motivations by product complexity, identifying that some customers of systems manufacturers seek business transformation through outsourcing.

Relevance: 30.00%

Abstract:

The penalty kick in football is a seemingly simple play; however, it has increased in complexity since 1997, when the rules changed to allow goalkeepers to move laterally along their goal line before the ball is kicked. Prior to 1997, goalkeepers were required to remain still until the ball was struck. The objective of this study was to determine the importance of the penalty kick in the modern game of football. A retrospective study of the 2002, 2006 and 2010 World Cup and the 2000, 2004 and 2008 European Championship tournaments was carried out, assessing the importance of the penalty kick in match play and shootouts and the effect of the time of the game on the shooter's success rate. The conversion rate of penalties was 73% in shootouts and 68% in match play. Significantly more penalties were awarded late in the game: twice as many in the second half as in the first, and close to four times as many in the final quarter as in the first. Teams awarded a penalty kick during match play won 52%, drew 30% and lost 18% of the time; the chance of winning increased to 61% if the penalty was scored and decreased to 29% if it was missed. Teams reaching the final match of either the World Cup or the European Championship had roughly a 50% chance of being involved in a penalty shootout during the tournament. Penalty shots and their outcomes significantly affect match results in post-1997 football.

Relevance: 30.00%

Abstract:

This paper examines methodological aspects of climate change assessment, particularly the aggregation of costs and benefits induced by climate change on individuals, societies, economies and the whole ecosystem. Assessing the total and/or marginal costs of environmental change is difficult because of the wide range of factors that have to be involved. The study therefore tries to capture the complexity of climate change cost assessment by considering several critical factors: scenarios and modeling, valuation and estimation, and equity and discounting.
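
Of the factors listed, discounting is the most mechanical, and a short sketch shows why it is critical: the present value of a damage stream is acutely sensitive to the chosen discount rate. The figures below are hypothetical.

```python
def present_value(damages, rate):
    """Discounted sum of a stream of annual climate damages.
    The choice of `rate` embeds the equity/discounting judgements
    the paper flags as critical."""
    return sum(d / (1 + rate) ** t for t, d in enumerate(damages, start=1))

damages = [100.0] * 50  # hypothetical: 100 units of damage per year for 50 years
print(present_value(damages, 0.01))  # low rate: future damages weigh heavily
print(present_value(damages, 0.05))  # high rate: distant damages count far less
```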

Relevance: 30.00%

Abstract:

Today, the development of domain-specific communication applications is both time-consuming and error-prone, because the low-level communication services provided by existing systems and networks are primitive and often heterogeneous. Multimedia communication applications are typically built on top of low-level network abstractions such as the TCP/UDP socket, SIP (Session Initiation Protocol) and RTP (Real-time Transport Protocol) APIs. The User-centric Communication Middleware (UCM) is proposed to encapsulate the networking complexity and heterogeneity of basic multimedia and multi-party communication for upper-layer communication applications. UCM provides a unified user-centric communication service to diverse communication applications, ranging from simple phone calls and video conferencing to specialized applications such as disaster management and telemedicine, making it easier to develop domain-specific communication applications. The UCM abstraction and API are proposed to achieve these goals. The dissertation also integrates formal methods into the UCM development process. A formal model of UCM is created using the SAM methodology; some design errors were found during model creation, because the formal method forces a precise description of UCM. Using the SAM tool, the formal UCM model is translated into a Promela model. Several system properties are defined as temporal logic formulas, which are manually translated into Promela, individually integrated with the Promela model of UCM, and verified using the SPIN tool. The formal analysis helps verify system properties (for example, of the multiparty multimedia protocol) and uncover system bugs.
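
The dissertation's actual properties are not quoted in the abstract; as a hypothetical example of the kind of temporal logic formula involved, a liveness requirement for session setup could read as follows in LTL, with invite and sessionEstablished as assumed propositions:

```latex
% Hypothetical LTL liveness property (not a formula from the dissertation):
% every session invitation is eventually followed by an established session.
\Box\,(\mathit{invite} \rightarrow \Diamond\,\mathit{sessionEstablished})
```

SPIN verifies such properties against the Promela model by exhaustively exploring its state space.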

Relevance: 30.00%

Abstract:

The need for elemental analysis techniques to solve forensic problems continues to expand as the samples collected from crime scenes grow in complexity. Laser ablation ICP-MS (LA-ICP-MS) has been shown to provide a high degree of discrimination between samples that originate from different sources. In the first part of this research, two laser ablation ICP-MS systems were compared, one using a nanosecond laser and the other a femtosecond laser source, for the forensic analysis of glass. The results showed that femtosecond LA-ICP-MS did not provide significant improvements in terms of accuracy, precision and discrimination; however, it did provide lower detection limits. In addition, it was determined that even for femtosecond LA-ICP-MS an internal standard should be used to obtain accurate analytical results for glass analyses. In the second part, a method using laser-induced breakdown spectroscopy (LIBS) for the forensic analysis of glass was shown to provide excellent discrimination for a glass set consisting of 41 automotive fragments. The discrimination power was compared to that of two leading elemental analysis techniques, µXRF and LA-ICP-MS, and the results were similar: all methods generated >99% discrimination, and the pairs found indistinguishable were similar. An extensive data analysis approach for LIBS glass analyses was developed to minimize Type I and Type II errors, en route to a recommendation of 10 ratios to be used for glass comparisons. Finally, a LA-ICP-MS method for the qualitative analysis and discrimination of gel ink sources was developed and tested on a set of ink samples. In the first discrimination study, qualitative analysis yielded 95.6% discrimination in a blind study of 45 black gel ink samples provided by the United States Secret Service, with a 0.4% false exclusion (Type I) error rate and a 3.9% false inclusion (Type II) error rate. In the second discrimination study, 99% discrimination power was achieved for a black gel ink pen set consisting of 24 self-collected samples. The two pairs found to be indistinguishable came from the same source of origin (the same manufacturer and type of pen, purchased in different locations). It was also found that gel ink from the same pen, regardless of age, was indistinguishable, as were gel ink pens (four pens) originating from the same pack.
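
The bookkeeping behind a discrimination study can be sketched as follows; the ±4·SD interval-overlap criterion is one convention used in forensic glass comparisons, and the data are invented, so this should be read as an illustration rather than the study's actual protocol.

```python
def indistinguishable(mean_a, sd_a, mean_b, sd_b, k=4):
    """Interval-overlap comparison on one ratio: samples match if their
    mean ± k·sd intervals overlap (k=4 is a common, assumed, choice)."""
    return (mean_a - k * sd_a <= mean_b + k * sd_b and
            mean_b - k * sd_b <= mean_a + k * sd_a)

def discrimination_power(samples, k=4):
    """Fraction of sample pairs distinguished on at least one ratio.
    `samples` maps sample id -> list of (mean, sd) per measured ratio."""
    ids = list(samples)
    pairs = [(a, b) for i, a in enumerate(ids) for b in ids[i + 1:]]
    distinguished = sum(
        any(not indistinguishable(ma, sa, mb, sb, k)
            for (ma, sa), (mb, sb) in zip(samples[a], samples[b]))
        for a, b in pairs)
    return distinguished / len(pairs)

samples = {  # hypothetical replicate means/sds for two ratios
    "G1": [(1.20, 0.02), (0.55, 0.01)],
    "G2": [(1.21, 0.02), (0.81, 0.01)],
    "G3": [(0.90, 0.03), (0.54, 0.02)],
}
print(discrimination_power(samples))  # 1.0: all three pairs distinguished
```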

Relevance: 30.00%

Abstract:

This dissertation establishes a novel data-driven method to identify language network activation patterns in pediatric epilepsy through the use of principal component analysis (PCA) on functional magnetic resonance imaging (fMRI) data. A total of 122 subjects' data sets from five different hospitals were included in the study through a web-based repository site designed here at FIU. Research was conducted to evaluate different classification and clustering techniques for identifying hidden activation patterns and their associations with meaningful clinical variables. The results were assessed through agreement analysis with the conventional methods of the lateralization index (LI) and visual rating. What is unique in this approach is the new mechanism designed for projecting language network patterns into the PCA-based decisional space. Synthetic activation maps were randomly generated from real data sets to establish nonlinear decision functions (NDF), which are then used to classify any new fMRI activation map as typical or atypical. The best nonlinear classifier was obtained in a 4D space with a complexity (nonlinearity) degree of 7. Based on the significant association of language dominance and intensities with the top eigenvectors of the PCA decisional space, a new algorithm was deployed to delineate primary cluster members without intensity normalization. Three distinct activation patterns (groups) were identified (average kappa of 0.65 with visual rating and 0.76 with LI), characterized by the regions of: (1) the left inferior frontal gyrus (IFG) and left superior temporal gyrus (STG), considered typical for the language task; (2) the IFG, left mesial frontal lobe and right cerebellum, representing a variant left-dominant pattern with higher activation; and (3) the right homologues of the first pattern in Broca's and Wernicke's language areas. Interestingly, group 2 was found to reflect a language compensation mechanism different from reorganization: its high-intensity activation suggests a possible remote effect of the right-hemisphere focus on traditionally left-lateralized functions. In retrospect, this data-driven method provides new insights into mechanisms of brain compensation/reorganization and neural plasticity in pediatric epilepsy.
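
A bare-bones sketch of projecting activation maps into a PCA decisional space follows, using an SVD; the synthetic-map generation and the degree-7 nonlinear decision function of the dissertation are not reproduced, and the dimensions are arbitrary.

```python
import numpy as np

def pca_project(maps, k=4):
    """Project flattened fMRI activation maps (subjects x voxels) onto the
    top-k principal components, yielding coordinates in a k-D decisional
    space (a simplified stand-in for the study's pipeline)."""
    X = maps - maps.mean(axis=0)               # center voxel-wise
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:k].T                        # scores on top-k eigenvectors

rng = np.random.default_rng(1)
maps = rng.normal(size=(122, 5000))            # e.g. 122 subjects x 5000 voxels
scores = pca_project(maps, k=4)
print(scores.shape)                            # (122, 4): the 4D decisional space
```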

Relevance: 30.00%

Abstract:

Background: Following incomplete spinal cord injury (iSCI), descending drive is impaired, possibly leading to a decrease in the complexity of gait. To test the hypothesis that iSCI impairs gait coordination and decreases locomotor complexity, we collected 3D joint angle kinematics and muscle parameters of rats with a sham or an incomplete spinal cord injury.
Methods: Twelve adult female Long-Evans rats, 6 sham and 6 with mild-moderate T8 iSCI, were tested 4 weeks following injury. The Basso-Beattie-Bresnahan locomotor score was used to verify injury severity. Animals had reflective markers placed on the bony prominences of their limb joints and were filmed in 3D while walking on a treadmill. Joint angles and segment motion were analyzed quantitatively, and the complexity of joint angle trajectories and of overall gait was calculated using permutation entropy and principal component analysis, respectively. Following treadmill testing, the animals were euthanized and hindlimb muscles removed. Excised muscles were tested for mass, density, fiber length, pennation angle, and relaxed sarcomere length.
Results: Muscle parameters were similar between groups, with no evidence of muscle atrophy. The animals showed overextension of the ankle, which was compensated for by a decreased range of motion at the knee. Left-right coordination was altered, leading to left and right knee movements that are entirely out of phase, with one joint moving while the other is stationary. Movement patterns remained symmetric. Permutation entropy measures indicated changes in complexity on a joint-specific basis, with the largest changes at the ankle. No significant difference was seen using principal component analysis. Rats were able to achieve stable weight-bearing locomotion at reasonable speeds on the treadmill despite these deficiencies.
Conclusions: The decrease in supraspinal control following iSCI causes a loss of complexity of ankle kinematics. This loss can be entirely due to the loss of supraspinal control, in the absence of muscle atrophy, and may be quantified using permutation entropy. Joint-specific differences in kinematic complexity may be attributed to different sources of motor control. This work indicates the importance of the ankle for rehabilitation interventions following spinal cord injury.
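
Permutation entropy, the joint-level complexity measure used here, is straightforward to compute: each window of a joint angle trajectory is reduced to its ordinal pattern, and the Shannon entropy of the pattern distribution is reported. A minimal sketch follows; the order and delay parameters are illustrative, not those of the study.

```python
import math
from itertools import permutations
import numpy as np

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Bandt-Pompe permutation entropy of a 1-D signal: Shannon entropy of
    the distribution of ordinal patterns of length `order`."""
    x = np.asarray(x, dtype=float)
    patterns = {p: 0 for p in permutations(range(order))}
    n = len(x) - (order - 1) * delay
    for i in range(n):
        window = x[i:i + order * delay:delay]
        patterns[tuple(np.argsort(window))] += 1   # ordinal pattern of window
    p = np.array([c for c in patterns.values() if c > 0]) / n
    h = float(-np.sum(p * np.log2(p)))
    return h / math.log2(math.factorial(order)) if normalize else h

t = np.linspace(0, 20 * np.pi, 2000)
print(permutation_entropy(np.sin(t)))                                   # low: regular motion
print(permutation_entropy(np.random.default_rng(0).normal(size=2000)))  # near 1: white noise
```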