910 results for complexity metrics


Relevance: 20.00%

Abstract:

Security defects are common in large software systems because of their size and complexity. Although efficient development processes, testing, and maintenance policies are applied to software systems, a large number of vulnerabilities can still remain despite these measures. Some vulnerabilities stay in a system from one release to the next because they cannot be easily reproduced through testing. These vulnerabilities endanger the security of the systems. We propose vulnerability classification and prediction frameworks based on vulnerability reproducibility. The frameworks are effective in identifying the types and locations of vulnerabilities at an early stage and in improving the security of the software in subsequent versions (referred to as releases). We expand an existing concept of software bug classification to vulnerability classification (easily reproducible and hard to reproduce) to develop a classification framework for differentiating between these vulnerabilities based on code fixes and textual reports. We then investigate the potential correlations between the vulnerability categories and classical software metrics, as well as other runtime environmental factors of reproducibility, to develop a vulnerability prediction framework. The classification and prediction frameworks help developers adopt corresponding mitigation or elimination actions and develop appropriate test cases. The vulnerability prediction framework also helps security experts focus their effort on the top-ranked vulnerability-prone files. As a result, the frameworks decrease the number of attacks that exploit security vulnerabilities in the next versions of the software. To build the classification and prediction frameworks, different machine learning techniques (C4.5 Decision Tree, Random Forest, Logistic Regression, and Naive Bayes) are employed. The effectiveness of the proposed frameworks is assessed on collected software security defects of Mozilla Firefox.
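As a rough illustration of how such a prediction framework might look (not the authors' actual pipeline), the Python sketch below trains one of the techniques named above, a Random Forest, on hypothetical per-file software metrics and ranks files by predicted vulnerability risk; the metric names and data are placeholders.

```python
# Minimal sketch: predicting vulnerability-prone files from code metrics.
# Feature names and data are illustrative placeholders, not the study's dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical per-file metrics: lines of code, cyclomatic complexity,
# number of past fixes, and churn in the last release.
X = np.array([
    [1200, 35, 4, 220],
    [ 300,  8, 0,  15],
    [2500, 60, 7, 480],
    [ 150,  5, 1,  10],
    [ 900, 22, 3, 130],
    [ 700, 18, 0,  40],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = vulnerability reported in the next release

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=3)   # quick sanity check
clf.fit(X, y)

# Rank files so security experts can focus on the most vulnerability-prone ones.
ranking = np.argsort(clf.predict_proba(X)[:, 1])[::-1]
print("cross-val accuracy:", scores.mean())
print("files ranked by predicted risk:", ranking)
```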

Relevance: 20.00%

Abstract:

The pace at which challenges are introduced in a game has long been identified as a key determinant of both the enjoyment and difficulty experienced by game players, and of their ability to learn from game play. In order to understand how best to pace challenges in games, there is great value in analysing games already demonstrated as highly engaging. Play-through videos of four puzzle games (Portal, Portal 2 Co-operative mode, Braid and Lemmings) were observed and analysed using metrics derived from a behavioural-psychology understanding of how people solve problems. Findings suggest that: (1) the main skills learned in each game are introduced separately, (2) through simple puzzles that require only basic performance of that skill; (3) the player then has the opportunity to practice and integrate that skill with previously learned skills; and (4) puzzles increase in complexity until the next new skill is introduced. These data provide practical guidance for designers, support contemporary thinking on the design of learning structures in games, and suggest future directions for empirical research.

Relevance: 20.00%

Abstract:

Background: This study is part of an interactive improvement intervention aimed at facilitating empowerment-based chronic kidney care using data from persons with CKD and their family members. There are many challenges to implementing empowerment-based care, and it is therefore necessary to study the implementation process. The aim of this study was to generate knowledge regarding the implementation process of an improvement intervention of empowerment for those who require chronic kidney care. Methods: A prospective single qualitative case study was chosen to follow the process of the implementation over a two-year period. Twelve health care professionals were selected based on their various roles in the implementation of the improvement intervention. Data collection comprised digitally recorded project group meetings, field notes of the meetings, and individual interviews before and after the improvement project. These data were analyzed using qualitative latent content analysis. Results: Two facilitator themes emerged: Moving spirit and Encouragement. The healthcare professionals described a willingness to individualize care and to increase their professional development in the field of chronic kidney care. The implementation process was strongly reinforced both by the researchers working interactively with the staff and by the project group. One theme emerged as a barrier: Limitations of the organization. Changes in the organization hindered the implementation of the intervention throughout the study period, and the lack of interplay within the organization impeded the process most. Conclusions: The findings indicated the complexity of maintaining a sustainable and lasting implementation over a period of two years. Implementing empowerment-based care was found to be facilitated by cooperation among all involved healthcare professionals. Furthermore, long-term improvement interventions need strong encouragement from all levels of the organization to maintain engagement, even when they are initiated by the healthcare professionals themselves.

Relevance: 20.00%

Abstract:

In this paper we discuss the temporal aspects of indexing and classification in information systems. We base this discussion on three sources of research on scheme change in indexing: (1) analytical research on the types of scheme change, (2) empirical data on scheme change in systems, and (3) evidence of cataloguer decision-making in the context of scheme change. From this general discussion we propose two constructs along which we might craft metrics to measure scheme change: collocative integrity and semantic gravity. The paper closes with a discussion of these constructs.

Relevance: 20.00%

Abstract:

Research poster about indexing theory

Relevance: 20.00%

Abstract:

Self-replication and compartmentalization are two central properties thought to be essential for minimal life, and understanding how such processes interact in the emergence of complex reaction networks is crucial to exploring the development of complexity in chemistry and biology. Autocatalysis can emerge from multiple different mechanisms, such as formation of an initiator, template self-replication and physical autocatalysis (where micelles formed from the reaction product solubilize the reactants, leading to higher local concentrations and therefore higher rates). Amphiphiles are also used in artificial life studies to create protocell models such as micelles, vesicles and oil-in-water droplets, and can increase reaction rates by encapsulation of reactants. So far, no template self-replicator exists which is capable of compartmentalization, or of transferring this molecular-scale phenomenon to micro- or macro-scale assemblies. Here a system is demonstrated in which an amphiphilic imine catalyses its own formation by joining a non-polar alkyl tail group with a polar carboxylic acid head group to form a template, which was shown to form reverse micelles by Dynamic Light Scattering (DLS). The kinetics of this system were investigated by 1H NMR spectroscopy, showing clearly that a template self-replication mechanism operates, though there was no evidence that the reverse micelles participated in physical autocatalysis.

Active oil droplets, composed of a mixture of insoluble organic compounds in an aqueous sub-phase, can undergo processes such as division, self-propulsion and chemotaxis, and are studied as models for minimal cells, or protocells. Although in most cases the Marangoni effect is responsible for the forces on the droplet, the behaviour of the droplet depends heavily on its exact composition. Though theoretical models are able to calculate the forces on a droplet, modelling a mixture of oils on an aqueous surface, where compounds from the oil phase are dissolving and diffusing through the aqueous phase, is beyond current computational capability. The behaviour of a droplet in an aqueous phase can therefore only be discovered through experiment, although it is determined by the droplet's composition. By using an evolutionary algorithm and a liquid-handling robot to conduct droplet experiments and decide, entirely autonomously, which compositions to test next, the composition of the droplet becomes a chemical genome capable of evolution. Selection is carried out according to a fitness function, which ranks each formulation based on how well it conforms to the chosen fitness criteria (e.g. movement or division). Over successive generations, significant increases in fitness are achieved, and this increase is higher with more components (i.e. greater complexity). Other chemical processes, such as chemiluminescence and gelation, were investigated in active oil droplets, demonstrating the possibility of controlling chemical reactions by selective droplet fusion. Potential future applications might include combinatorial chemistry, or additional fitness goals for the genetic algorithm.

Combining the self-replication and droplet-protocell research, it was demonstrated that the presence of the amphiphilic replicator lowers the interfacial tension between droplets of a reaction mixture in organic solution and the alkaline aqueous phase, causing them to divide. Periodic sampling by a liquid-handling robot revealed that the extent of droplet fission increased as the reaction progressed, producing more individual protocells as self-replication increased. This demonstrates coupling of the molecular-scale phenomenon of template self-replication to a macroscale physicochemical effect.
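To make the evolutionary loop concrete, here is a minimal Python sketch of a genetic algorithm over droplet formulations. It assumes a four-component composition and substitutes a placeholder fitness function for the experimental measurement (in the actual work, fitness such as droplet movement is measured via the liquid-handling robot), so all numbers are illustrative.

```python
# Minimal sketch of an evolutionary loop over droplet compositions.
# The fitness function is a stand-in for a measured property such as speed.
import random

N_COMPONENTS = 4          # fraction of each oil in the formulation
POP_SIZE = 20
GENERATIONS = 30

def random_genome():
    w = [random.random() for _ in range(N_COMPONENTS)]
    s = sum(w)
    return [x / s for x in w]              # compositions sum to 1

def fitness(genome):
    # Placeholder: in the real system this would be an experimental measurement.
    target = [0.4, 0.3, 0.2, 0.1]
    return -sum((g - t) ** 2 for g, t in zip(genome, target))

def mutate(genome, rate=0.1):
    w = [max(1e-6, g + random.gauss(0, rate)) for g in genome]
    s = sum(w)
    return [x / s for x in w]

population = [random_genome() for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    ranked = sorted(population, key=fitness, reverse=True)
    parents = ranked[: POP_SIZE // 4]       # keep the fittest formulations
    population = parents + [mutate(random.choice(parents))
                            for _ in range(POP_SIZE - len(parents))]

best = max(population, key=fitness)
print("best formulation:", [round(x, 3) for x in best])
```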

Relevance: 20.00%

Abstract:

Complexity theory, originally developed for the study of phenomena in the natural sciences, offers an alternative framework for understanding the emergent events that arise in the international system. This monograph correlates the language of complexity with international relations, focusing on the Visegrad-Ukraine relationship, which has been the scene of a series of emergent and unexpected events since the civil protests of November 2013 in Kiev. Since then, the complex system that exists between the Visegrad Group and Ukraine has had to adapt to recurrent emergent events and to self-organize. In this way it can behave in accordance with unpredictable scenarios, particularly with regard to its energy interactions and its political interconnections.

Relevance: 20.00%

Abstract:

This exploratory work studies the political movement Mesa de la Unidad Democrática (MUD), created to oppose the existing socialist government in Venezuela. The critique in this document is made from the standpoint of the Science of Complexity. Some key concepts of complex systems are used to explain the functioning and organization of the MUD, with the aim of producing a comprehensive diagnosis of the problems it faces and of highlighting new insights into the harmful behaviours the party currently exhibits. The complexity approach is intended to help better understand the context that frames the party and, finally, to offer a series of solutions to the cohesion problems it presents.

Relevance: 20.00%

Abstract:

Magnetic Resonance Imaging (MRI) is the in vivo technique most commonly employed to characterize changes in brain structures. The conventional MRI-derived morphological indices are able to capture only partial aspects of brain structural complexity. Fractal geometry and its most popular index, the fractal dimension (FD), can characterize self-similar structures, including grey matter (GM) and white matter (WM). Previous literature shows the need for a definition of the so-called fractal scaling window, within which each structure manifests self-similarity. This justifies the existence of fractal properties and confirms Mandelbrot's assertion that "fractals are not a panacea; they are not everywhere". In this work, we propose a new approach to automatically determine the fractal scaling window, computing two new fractal descriptors, i.e., the minimal and maximal fractal scales (mfs and Mfs). Our method was implemented in a software package, validated on phantoms and applied to large datasets of structural MR images. We demonstrated that the FD is a useful marker of the morphological complexity changes that occur during brain development and aging, and, using ultra-high magnetic field (7T) examinations, we showed that the cerebral GM has fractal properties even below the spatial scale of 1 mm. We applied our methodology to two neurological diseases. We observed a reduction of brain structural complexity in SCA2 patients and, using a machine learning approach, showed that the cerebral WM FD is a consistent feature in predicting cognitive decline in patients with small vessel disease and mild cognitive impairment. Finally, we showed that the FD of the WM skeletons derived from diffusion MRI provides complementary information to that obtained from the FD of the general WM structure in T1-weighted images. In conclusion, the fractal descriptors of structural brain complexity are candidate biomarkers for detecting subtle morphological changes during development, aging and neurological diseases.
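For readers unfamiliar with the fractal dimension, the following Python sketch shows a generic 2-D box-counting estimate of the FD on a toy binary mask. It is not the software package described above, which works on 3-D MR segmentations and restricts the fit to the automatically determined fractal scaling window (mfs to Mfs).

```python
# Generic 2-D box-counting estimate of the fractal dimension (FD).
# The toy random mask stands in for a GM/WM segmentation slice.
import numpy as np

def box_count(mask, box_size):
    """Count boxes of a given size containing at least one foreground pixel."""
    h, w = mask.shape
    count = 0
    for i in range(0, h, box_size):
        for j in range(0, w, box_size):
            if mask[i:i + box_size, j:j + box_size].any():
                count += 1
    return count

def fractal_dimension(mask, box_sizes):
    counts = [box_count(mask, s) for s in box_sizes]
    # FD is the slope of log N(s) against log(1/s).
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(0)
mask = rng.random((256, 256)) < 0.3
print("estimated FD:", fractal_dimension(mask, box_sizes=[2, 4, 8, 16, 32]))
```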

Relevance: 20.00%

Abstract:

Against a backdrop of a rapidly increasing worldwide population and growing energy demand, the development of renewable energy technologies has become of primary importance in the effort to reduce greenhouse gas emissions. However, it is often technically and economically infeasible to transport discontinuous renewable electricity over long distances to shore. Another shortcoming of non-programmable renewable power is its integration into the onshore grid without affecting the dispatching process. On the other hand, the offshore oil & gas industry is striving to reduce the overall carbon footprint of onsite power generators and to limit the large expenses associated with carrying electricity from remote offshore facilities. Furthermore, the increased complexity and expansion of offshore hydrocarbon operations towards challenging areas call for greater attention to safety and environmental protection issues arising from major accident hazards. Innovative hybrid energy systems, such as Power-to-Gas (P2G), Power-to-Liquid (P2L) and Gas-to-Power (G2P) options implemented at offshore locations, would offer the opportunity to overcome challenges of both the renewable and oil & gas sectors. This study aims at the development of systematic methodologies, based on appropriate sustainability and safety performance indicators, supporting the choice of P2G, P2L and G2P hybrid energy options for offshore green projects in early design phases. An in-depth analysis of the different offshore hybrid strategies was performed. Literature reviews were carried out on existing methods proposing metrics to assess the sustainability of hybrid energy systems, the inherent safety of process routes at the conceptual design stage, and the protection of installations from accidental oil and chemical spills. To fill the gaps, a suite of specific decision-making methodologies was developed, based on representative multi-criteria indicators addressing technical, economic, environmental and societal aspects of the alternative options. A set of five case studies was defined, covering different offshore scenarios of concern, to provide an assessment of the effectiveness and value of the developed tools.
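As a schematic of how multi-criteria indicators can be aggregated to compare the P2G, P2L and G2P options, the Python sketch below computes a simple weighted score over four criteria; the indicator values and weights are invented placeholders, not the thesis's actual metrics.

```python
# Generic multi-criteria scoring sketch for ranking hybrid energy options.
# Scores per criterion are assumed normalised to [0, 1] (higher = better);
# all numbers are illustrative placeholders.
options = {
    "P2G": {"technical": 0.7, "economic": 0.5, "environmental": 0.8, "societal": 0.6},
    "P2L": {"technical": 0.6, "economic": 0.6, "environmental": 0.7, "societal": 0.5},
    "G2P": {"technical": 0.8, "economic": 0.7, "environmental": 0.5, "societal": 0.6},
}
weights = {"technical": 0.3, "economic": 0.3, "environmental": 0.25, "societal": 0.15}

def aggregate(scores):
    # Weighted sum of the normalised criterion scores.
    return sum(weights[c] * scores[c] for c in weights)

for name, score in sorted(((n, aggregate(s)) for n, s in options.items()),
                          key=lambda x: x[1], reverse=True):
    print(f"{name}: {score:.3f}")
```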

Relevance: 20.00%

Abstract:

The advent of omic data production has opened many new perspectives in the quest for modelling complexity in biophysical systems. With the capability of characterizing a complex organism through the patterns of its molecular states, observed at different levels through various omics, a new paradigm of investigation is arising. In this thesis, we investigate the links between perturbations of the human organism, described as the ensemble of crosstalk of its molecular states, and health. Machine learning plays a key role within this picture, both in omic data analysis and in model building. We propose and discuss different frameworks developed by the author that use machine learning for data reduction, integration, projection onto latent features, pattern analysis, classification and clustering of omic data, with a focus on 1H NMR metabolomic spectral data. The aim is to link different levels of omic observations of molecular states, from the nanoscale to the macroscale, to study perturbations such as diseases and diet, interpreted as changes in molecular patterns. The first part of this work focuses on the fingerprinting of diseases, linking cellular and systemic metabolomics with genomics to assess and predict the downstream effects of perturbations, all the way down to the enzymatic network. The second part is a set of frameworks and models, developed with 1H NMR metabolomics at its core, to study the exposure of the human organism to diet and food intake in its full complexity, from epidemiological data analysis to the molecular characterization of food structure.
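A minimal sketch of one such framework, data reduction followed by classification, is shown below in Python using scikit-learn; the spectral data are synthetic stand-ins for binned 1H NMR metabolomic profiles, not the thesis's datasets.

```python
# Minimal sketch: dimensionality reduction of spectral features followed by
# classification. Data are synthetic stand-ins for binned 1H NMR spectra.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_bins = 80, 500             # samples x spectral bins
X = rng.normal(size=(n_samples, n_bins))
y = rng.integers(0, 2, size=n_samples)  # e.g. diseased vs. healthy
X[y == 1, :20] += 0.8                   # inject a weak group difference

model = make_pipeline(PCA(n_components=10), LogisticRegression(max_iter=1000))
scores = cross_val_score(model, X, y, cv=5)
print("cross-val accuracy:", scores.mean())
```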

Relevance: 20.00%

Abstract:

Neuroblastoma (NB) is the most common type of tumor in infants and the third most common cancer in children. Current clinical practice employs a variety of strategies for NB treatment, ranging from standard chemotherapy to immunotherapy. Due to a lack of knowledge about the molecular mechanisms underlying the disease's onset, aggressive phenotype, and therapeutic resistance, these approaches are ineffective in the majority of cases. MYCN amplification is one of the most well-known genetic alterations associated with high risk in NB. The following work is divided into three sections and aims to provide new insights into the biology of NB and hypothetical new treatment strategies. First, we identified RUNX1T1 as a key gene involved in MYCN-driven NB onset in a transgenic mouse model. Our results suggested that RUNX1T1 may recruit the Co-REST complex on target genes that regulate the differentiation of NB cells and that the interaction with RCOR3 is essential. Second, we provided insights into the role of MYCN in dysregulating the CDK/RB/E2F pathway controlling the G1/S transition of the cell cycle. We found that RB is dispensable in regulating the cell cycle of MYCN-amplified NB, providing the rationale for using cyclin/CDK complex inhibitors in NBs carrying MYCN amplification and relatively high levels of RB1 expression. Third, we generated an M13 bacteriophage platform to target GD2-expressing cells in NB, producing a recombinant M13 phage capable of selectively binding GD2-expressing cells (M13GD2). Our results showed that M13GD2 chemically conjugated with the photosensitizer ECB04 preserves the retargeting capability, inducing cell death even at picomolar concentrations upon light irradiation. These results provide proof of concept for the employment of M13 phages in targeted photodynamic therapy for NB, an exciting strategy to overcome resistance to classical immunotherapy.

Relevance: 20.00%

Abstract:

In recent decades, robotics has become firmly embedded in areas such as education, teaching, medicine, psychology and many others. We focus here on social robotics; social robots are designed to interact with people in a natural and interpersonal way, often to achieve positive results in different applications. To interact and cooperate with humans in their daily-life activities, robots should exhibit human-like intelligence. The rapid expansion of social robotics and the availability of various kinds of robots on the market have allowed research groups to carry out multiple experiments. The experiments carried out have led to the collection of various kinds of data, which can be used or processed for psychological studies and for studies in other fields. However, there are no tools available in which these data can be stored, processed and shared with other research groups. This thesis proposes the design and implementation of a visual tool for organizing dataflows in Human Robot Interaction (HRI).

Relevance: 20.00%

Abstract:

In modern society, the security issues of IT systems are intertwined with interdisciplinary aspects, from social life to sustainability, and threats endanger many aspects of everyone's daily life. To address the problem, it is important that the systems we use guarantee a certain degree of security; but to achieve this, it is necessary to be able to measure that security. Measuring security is not an easy task, but many initiatives, including European regulations, aim to make this possible. One method of measuring security is based on the use of security metrics: these are a way of assessing, from various aspects, vulnerabilities, methods of defense, risks and impacts of successful attacks, as well as the efficacy of reactions, giving precise results using mathematical and statistical techniques. I have carried out literature research to provide an overview of the meaning, the effects, the problems, the applications and the overall current state of security metrics, with particular emphasis on giving practical examples. This thesis starts with a summary of the state of the art in the field of security metrics and with application examples, in order to outline the gaps in the current literature and the difficulties that arise when the application context changes, and then advances research questions aimed at fostering the discussion towards the definition of a more complete and applicable view of the subject. Finally, it stresses the lack of security metrics that consider interdisciplinary aspects, giving some potential starting points for developing security metrics that cover all the aspects involved, taking the field to a new level of formal soundness and practical usability.
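In the spirit of the practical examples the thesis emphasizes, the short Python sketch below computes two simple, commonly cited security metrics, vulnerability density and mean time to remediate, from a placeholder vulnerability log; the fields and figures are invented for illustration.

```python
# Illustrative computation of two simple security metrics from a placeholder
# vulnerability log: vulnerability density (per KLOC) and mean time to remediate.
from datetime import date

vulnerabilities = [
    {"severity": "high",   "opened": date(2023, 1, 10), "closed": date(2023, 1, 25)},
    {"severity": "medium", "opened": date(2023, 2,  3), "closed": date(2023, 3,  1)},
    {"severity": "low",    "opened": date(2023, 2, 20), "closed": date(2023, 2, 28)},
]
lines_of_code = 120_000

density = len(vulnerabilities) / (lines_of_code / 1000)   # vulnerabilities per KLOC
mttr = sum((v["closed"] - v["opened"]).days for v in vulnerabilities) / len(vulnerabilities)

print(f"vulnerability density: {density:.3f} per KLOC")
print(f"mean time to remediate: {mttr:.1f} days")
```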

Relevance: 20.00%

Abstract:

Sales prediction plays a huge role in modern business strategies. One of its many use cases revolves around estimating the effects of promotions. While promotions generally have a positive effect on sales of the promoted product, they can also have a negative effect on the sales of other products. This phenomenon is called sales cannibalisation. Sales cannibalisation can pose a big problem for sales forecasting algorithms. Often, these algorithms focus on the sales over time of a single product in a single store (a product-store couple). This research focuses on using knowledge of a product across multiple different stores. To achieve this, we applied transfer learning to a neural model developed by Kantar Consulting to demonstrate an approach to estimating the effect of cannibalisation. Our results show a performance increase of between 10 and 14 percent. This is a very promising result, and Kantar will use the approach when integrating this method into their actual systems.
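Kantar's model is not public, so the following Python (PyTorch) sketch only illustrates the general transfer-learning idea described above: pretrain a small network on sales data pooled across stores, then freeze the shared layers and fine-tune the output head on a single target store. All data and layer sizes are synthetic placeholders.

```python
# Generic transfer-learning sketch (not Kantar's model): pretrain on pooled
# multi-store data, then freeze shared layers and fine-tune the head on one store.
import torch
import torch.nn as nn

torch.manual_seed(0)
n_features = 8                                    # e.g. price, promo flags, seasonality

model = nn.Sequential(
    nn.Linear(n_features, 32), nn.ReLU(),
    nn.Linear(32, 16), nn.ReLU(),                 # shared representation
    nn.Linear(16, 1),                             # sales prediction head
)

def train(model, X, y, epochs, lr=1e-2):
    params = [p for p in model.parameters() if p.requires_grad]
    opt = torch.optim.Adam(params, lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()
    return loss.item()

# 1) Pretrain on pooled multi-store data (synthetic stand-in).
X_all, y_all = torch.randn(500, n_features), torch.randn(500, 1)
train(model, X_all, y_all, epochs=200)

# 2) Freeze shared layers, fine-tune only the head on the target store's samples.
for layer in list(model.children())[:-1]:
    for p in layer.parameters():
        p.requires_grad = False
X_store, y_store = torch.randn(40, n_features), torch.randn(40, 1)
print("fine-tune loss:", train(model, X_store, y_store, epochs=100))
```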