857 results for complexity of agents


Relevance: 100.00%

Abstract:

The loss of habitat and biodiversity worldwide has led to considerable resources being spent for conservation purposes on actions such as the acquisition and management of land, the rehabilitation of degraded habitats, and the purchase of easements from private landowners. Prioritising these actions is challenging due to the complexity of the problem and because there can be multiple actors undertaking conservation actions, often with divergent or partially overlapping objectives. We use a modelling framework to explore this issue with a study involving two agents sequentially purchasing land for conservation. We apply our model to simulated data using distributions taken from real data to simulate the cost of patches and the rarity and co-occurrence of species. In our model each agent attempted to implement a conservation network that met its target for the minimum cost using the conservation planning software Marxan. We examine three scenarios where the conservation targets of the agents differ. The first scenario (called NGO-NGO) models the situation where two NGOs are targeting different sets of threatened species. The second and third scenarios (called NGO-Gov and Gov-NGO, respectively) represent a case where a government agency attempts to implement a complementary conservation network representing all species, while an NGO is focused on achieving additional protection for the most endangered species. For each of these scenarios we examined three types of interactions between agents: i) acting in isolation, where the agents attempt to achieve their targets solely through their own actions; ii) sharing information, where each agent is aware of the species representation achieved within the other agent's conservation network; and iii) pooling resources, where agents combine their resources and undertake conservation actions as a single entity. The latter two interactions represent different types of collaboration, and in each scenario we determine the cost savings from sharing information or pooling resources. In each case we examined the utility of these interactions from the viewpoint of the combined conservation network resulting from both agents' actions, as well as from each agent's individual perspective. The costs for each agent to achieve their objectives varied depending on the order in which the agents acted, the type of interaction between agents, and the specific goals of each agent. There were significant cost savings from increased collaboration via sharing information in the NGO-NGO scenario, where the agents' representation goals were mutually exclusive (in terms of the species targeted). In the NGO-Gov and Gov-NGO scenarios, collaboration generated much smaller savings. If the two agents collaborate by pooling resources, there are multiple ways the total cost could be shared between both agents. For each scenario we investigate the costs and benefits for all possible cost-sharing proportions. We find that there is a range of cost-sharing proportions where both agents can benefit in the NGO-NGO scenario, while the NGO-Gov and Gov-NGO scenarios again showed little benefit. Although the model presented here has a range of simplifying assumptions, it demonstrates that the value of collaboration can vary significantly in different situations. In most cases, collaborating would have associated costs, and these costs need to be weighed against the potential benefits from collaboration.
The model demonstrates a method for determining the range of collaboration costs that would result in collaboration providing an efficient use of scarce conservation resources.
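
As a rough illustration of the sequential two-agent setup and of the "sharing information" interaction described above, the sketch below uses a greedy minimum-cost heuristic on synthetic patch data. The study itself uses Marxan (simulated annealing); the patch counts, occurrence probabilities and greedy rule here are illustrative assumptions, not the paper's implementation.

```python
import random

random.seed(1)

N_PATCHES, N_SPECIES = 200, 30
cost = [random.uniform(1, 10) for _ in range(N_PATCHES)]
# presence[p] = set of species occurring on patch p (sparse occurrences)
presence = [{s for s in range(N_SPECIES) if random.random() < 0.05}
            for _ in range(N_PATCHES)]

def greedy_reserve(targets, already_met=frozenset()):
    """Greedy minimum-cost stand-in for Marxan: buy patches until every
    target species not already covered elsewhere is represented."""
    need = set(targets) - set(already_met)
    chosen, total = set(), 0.0
    while need:
        best, best_score = None, 0.0
        for p in range(N_PATCHES):
            if p in chosen:
                continue
            gain = len(presence[p] & need)
            if gain and gain / cost[p] > best_score:
                best, best_score = p, gain / cost[p]
        if best is None:               # some target species occur on no patch
            break
        chosen.add(best)
        total += cost[best]
        need -= presence[best]
    return chosen, total

# Agent A targets species 0-14, agent B species 15-29 (the NGO-NGO case).
targets_a, targets_b = range(0, 15), range(15, 30)

# (i) acting in isolation: B ignores what A's network already protects
net_a, cost_a = greedy_reserve(targets_a)
_, cost_b_isolated = greedy_reserve(targets_b)

# (ii) sharing information: B counts species already represented in A's network
covered_by_a = set().union(*(presence[p] for p in net_a))
_, cost_b_informed = greedy_reserve(targets_b, already_met=covered_by_a)

print(f"A: {cost_a:.1f}   B isolated: {cost_b_isolated:.1f}   "
      f"B with shared information: {cost_b_informed:.1f}")
```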

Relevance: 100.00%

Abstract:

The loss of habitat and biodiversity worldwide has led to considerable resources being spent on conservation interventions. Prioritising these actions is challenging due to the complexity of the problem and because there can be multiple actors undertaking conservation actions, often with divergent or partially overlapping objectives. We explore this issue with a simulation study involving two agents sequentially purchasing land for the conservation of multiple species using three scenarios comprising either divergent or partially overlapping objectives between the agents. The first scenario investigates the situation where both agents are targeting different sets of threatened species. The second and third scenarios represent a case where a government agency attempts to implement a complementary conservation network representing 200 species, while a non-government organisation is focused on achieving additional protection for the ten rarest species. Simulated input data were generated using distributions taken from real data to model the cost of parcels, and the rarity and co-occurrence of species. We investigated three types of collaborative interactions between agents: acting in isolation, sharing information, and pooling resources, with the third option resulting in the agents combining their resources and effectively acting as a single entity. In each scenario we determine the cost savings when an agent moves from acting in isolation to either sharing information or pooling resources with the other agent. The model demonstrates how the value of collaboration can vary significantly in different situations. In most cases, collaborating would have associated costs, and these costs need to be weighed against the potential benefits from collaboration. Our model demonstrates a method for determining the range of costs that would result in collaboration providing an efficient use of scarce conservation resources.

Relevance: 100.00%

Abstract:

This article investigates the role of translation and interpreting in political discourse. It illustrates discursive events in the domain of politics and the resulting discourse types, such as jointly produced texts, press conferences and speeches. It shows that methods of Critical Discourse Analysis can be used effectively to reveal translation and interpreting strategies as well as transformations that occur in recontextualisation processes across languages, cultures, and discourse domains, in particular recontextualisation in mass media. It argues that the complexity of translational activities in the field of politics has not yet received sufficient attention within Translation Studies. The article concludes by outlining a research programme for investigating political discourse in translation. ©2012 John Benjamins Publishing Company.

Relevance: 100.00%

Abstract:

A methodology for formally modeling and analyzing the software architecture of mobile agent systems provides a solid basis for developing high-quality mobile agent systems, and the methodology is helpful for studying other distributed and concurrent systems as well. However, providing such a methodology is a challenge because of agent mobility in mobile agent systems. The methodology was defined from two essential parts of software architecture: a formalism to define the architectural models and an analysis method to formally verify system properties. The formalism is two-layer Predicate/Transition (PrT) nets extended with dynamic channels, and the analysis method is a hierarchical approach to verify models at different levels. The two-layer modeling formalism smoothly transforms physical models of mobile agent systems into their architectural models. Dynamic channels facilitate the synchronous communication between nets, and they naturally capture the dynamic architecture configuration and agent mobility of mobile agent systems. Component properties are verified based on transformed individual components, system properties are checked in a simplified system model, and interaction properties are analyzed on models composed from the involved nets. Based on the formalism and the analysis method, this researcher formally modeled and analyzed a software architecture of mobile agent systems, and designed an architectural model of a medical information processing system based on mobile agents. The model checking tool SPIN was used to verify system properties such as reachability, concurrency and safety of the medical information processing system. From successfully modeling and analyzing the software architecture of mobile agent systems, the conclusion is that PrT nets extended with channels are a powerful tool for modeling mobile agent systems, and the hierarchical analysis method provides a rigorous foundation for the modeling tool. The hierarchical analysis method not only reduces the complexity of the analysis, but also expands the application scope of model checking techniques. The results of formally modeling and analyzing the software architecture of the medical information processing system show that model checking is an effective and efficient way to verify software architecture. Moreover, this system demonstrates the high flexibility, efficiency and low cost of mobile agent technologies.
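
The dissertation's analysis rests on PrT nets checked hierarchically with SPIN, and neither is reproduced here. Purely to illustrate the kind of safety property such model checking establishes, the toy sketch below performs an explicit-state reachability check over a hand-written transition system for a single migrating agent. The hosts, the encryption flag and the safety property are illustrative assumptions, not the author's models.

```python
from collections import deque

# Toy state: (agent_location, payload_encrypted). The safety property checked:
# the agent never sits on the 'public' host while its medical record payload
# is unencrypted. Hosts, actions and the property are illustrative assumptions.
HOSTS = ["clinic", "lab", "public"]

def successors(state):
    loc, encrypted = state
    # the agent may encrypt or decrypt its payload only on trusted hosts
    if loc != "public":
        yield (loc, True)
        yield (loc, False)
    # the agent may migrate, but (toy policy) only moves to the public host
    # with an encrypted payload
    for nxt in HOSTS:
        if nxt == "public" and not encrypted:
            continue
        yield (nxt, encrypted)

def violates_safety(state):
    loc, encrypted = state
    return loc == "public" and not encrypted

def check(initial):
    """Breadth-first search of the reachable state space."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if violates_safety(state):
            return False, state
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True, None

ok, witness = check(("clinic", False))
print("safety holds" if ok else f"violated in state {witness}")
```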

Relevance: 100.00%

Abstract:

Background: Following incomplete spinal cord injury (iSCI), descending drive is impaired, possibly leading to a decrease in the complexity of gait. To test the hypothesis that iSCI impairs gait coordination and decreases locomotor complexity, we collected 3D joint angle kinematics and muscle parameters of rats with a sham or an incomplete spinal cord injury. Methods: 12 adult, female, Long-Evans rats, 6 sham and 6 mild-moderate T8 iSCI, were tested 4 weeks following injury. The Basso Beattie Bresnahan locomotor score was used to verify injury severity. Animals had reflective markers placed on the bony prominences of their limb joints and were filmed in 3D while walking on a treadmill. Joint angles and segment motion were analyzed quantitatively, and complexity of joint angle trajectory and overall gait were calculated using permutation entropy and principal component analysis, respectively. Following treadmill testing, the animals were euthanized and hindlimb muscles removed. Excised muscles were tested for mass, density, fiber length, pennation angle, and relaxed sarcomere length. Results: Muscle parameters were similar between groups with no evidence of muscle atrophy. The animals showed overextension of the ankle, which was compensated for by a decreased range of motion at the knee. Left-right coordination was altered, leading to left and right knee movements that were entirely out of phase, with one joint moving while the other was stationary. Movement patterns remained symmetric. Permutation entropy measures indicated changes in complexity on a joint-specific basis, with the largest changes at the ankle. No significant difference was seen using principal component analysis. Rats were able to achieve stable weight-bearing locomotion at reasonable speeds on the treadmill despite these deficiencies. Conclusions: A decrease in supraspinal control following iSCI causes a loss of complexity of ankle kinematics. This loss can be entirely due to loss of supraspinal control in the absence of muscle atrophy and may be quantified using permutation entropy. Joint-specific differences in kinematic complexity may be attributed to different sources of motor control. This work indicates the importance of the ankle for rehabilitation interventions following spinal cord injury.
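
Permutation entropy (Bandt and Pompe) is a standard complexity measure for time series such as the joint angle trajectories used above; a minimal normalised implementation is sketched below for context. The embedding order and delay shown are common defaults, not the settings used in the study.

```python
import math
from collections import Counter

import numpy as np

def permutation_entropy(x, order=3, delay=1):
    """Normalised permutation entropy (Bandt & Pompe) of a 1-D signal.

    Each length-`order` window (sampled every `delay` points) is mapped to
    its ordinal pattern; the Shannon entropy of the pattern distribution is
    then normalised by log(order!) so the result lies in [0, 1].
    """
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    if n <= 0:
        raise ValueError("signal too short for this order/delay")
    patterns = Counter()
    for i in range(n):
        window = x[i:i + order * delay:delay]
        patterns[tuple(np.argsort(window))] += 1
    probs = np.array(list(patterns.values()), dtype=float) / n
    return float(-(probs * np.log(probs)).sum() / math.log(math.factorial(order)))

# Example: a pure sinusoid vs. a noisy joint-angle-like trace
np.random.seed(0)
t = np.linspace(0, 20 * np.pi, 2000)
print(permutation_entropy(np.sin(t)))                                   # low complexity
print(permutation_entropy(np.sin(t) + 0.5 * np.random.randn(t.size)))   # higher complexity
```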

Relevance: 100.00%

Abstract:

Investigation of the performance of engineering project organizations is critical for understanding and eliminating inefficiencies in today’s dynamic global markets. The existing theoretical frameworks consider project organizations as monolithic systems and attribute the performance of project organizations to the characteristics of their constituents. However, project organizations consist of complex interdependent networks of agents, information, and resources whose interactions give rise to emergent properties that affect the overall performance of project organizations. Yet, our understanding of the emergent properties in project organizations and their impact on project performance is rather limited. This limitation is one of the major barriers towards the creation of integrated theories of performance assessment in project organizations. The objective of this paper is to investigate the emergent properties that affect the ability of project organizations to cope with uncertainty. Based on the theories of complex systems, we propose and test a novel framework in which the likelihood of performance variations in project organizations can be investigated based on the environment of uncertainty (i.e., static complexity, dynamic complexity, and external sources of disruption) as well as the emergent properties (i.e., absorptive capacity, adaptive capacity, and restorative capacity) of project organizations. The existence and significance of the different dimensions of the environment of uncertainty and of the emergent properties in the proposed framework are tested based on the analysis of information collected from interviews with senior project managers in the construction industry. The outcomes of this study provide a novel theoretical lens for proactive bottom-up investigation of performance in project organizations at the interface of emergent properties and uncertainty.

Relevance: 100.00%

Abstract:

Ecosystem engineers that increase habitat complexity are keystone species in marine systems, increasing shelter and niche availability, and therefore biodiversity. For example, kelp holdfasts form intricate structures and host the largest number of organisms in kelp ecosystems. However, methods that quantify 3D habitat complexity have only seldom been used in marine habitats, and never in kelp holdfast communities. This study investigated the role of kelp holdfasts (Laminaria hyperborea) in supporting benthic faunal biodiversity. Computer-aided tomography (CT) scanning was used to quantify the three-dimensional geometrical complexity of holdfasts, including volume, surface area and surface fractal dimension (FD). Additionally, the number of haptera, the number of haptera per unit of volume, and the age of the kelps were estimated. These measurements were compared to faunal biodiversity and community structure using partial least-squares regression and multivariate ordination. Holdfast volume explained most of the variance observed in the biodiversity indices; however, all other complexity measures also contributed strongly to the variance observed. Multivariate ordinations further revealed that surface area and haptera per unit of volume accounted for the patterns observed in faunal community structure. Using 3D image analysis, this study makes a strong contribution to elucidating the quantitative mechanisms underlying the observed relationship between biodiversity and habitat complexity. Furthermore, the potential of CT scanning as an ecological tool is demonstrated, and a methodology for its use in future similar studies is established. Such spatially resolved imagery analysis could help identify structurally complex areas as biodiversity hotspots, and may support the prioritization of areas for conservation.
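
For context, a surface fractal dimension of the kind reported here is commonly estimated by box counting over the segmented CT volume. The sketch below shows that estimation on a synthetic binary volume; the box sizes, the test object, and the fitting choices are illustrative assumptions, not the study's image-processing pipeline.

```python
import numpy as np

def box_counting_dimension(volume, box_sizes=(1, 2, 4, 8)):
    """Estimate the box-counting (fractal) dimension of a binary 3-D volume.

    For each box edge length s, count the number N(s) of s*s*s boxes that
    contain at least one occupied voxel, then fit log N(s) against log(1/s);
    the slope of the fit is the dimension estimate.
    """
    volume = np.asarray(volume).astype(bool)
    counts = []
    for s in box_sizes:
        # trim so that every axis is a multiple of s, then pool voxels into boxes
        trimmed = volume[:volume.shape[0] // s * s,
                         :volume.shape[1] // s * s,
                         :volume.shape[2] // s * s]
        pooled = trimmed.reshape(trimmed.shape[0] // s, s,
                                 trimmed.shape[1] // s, s,
                                 trimmed.shape[2] // s, s).any(axis=(1, 3, 5))
        counts.append(pooled.sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)),
                          np.log(np.array(counts, dtype=float)), 1)
    return slope

# Synthetic test: a solid ball, whose estimate should land near 3; applied to a
# segmented holdfast surface, values between 2 and 3 would be expected instead.
z, y, x = np.ogrid[-32:32, -32:32, -32:32]
ball = (x**2 + y**2 + z**2) <= 28**2
print(box_counting_dimension(ball))
```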

Relevance: 100.00%

Abstract:

The ability to estimate the impact of ongoing climate change on the hydrological behaviour of hydro-systems is a necessity for anticipating the inevitable and necessary adaptations that our societies must consider. In this context, this doctoral project presents a study evaluating the sensitivity of future hydrological projections to: (i) the lack of robustness in the identification of hydrological model parameters, (ii) the use of several equifinal parameter sets, and (iii) the use of different hydrological model structures. To quantify the impact of the first source of uncertainty on model outputs, four climatically contrasted sub-periods are first identified within the observed records. The models are calibrated on each of these four periods and the resulting outputs are analysed in calibration and in validation following the four configurations of the Differential Split-Sample Test (Klemeš, 1986; Wilby, 2005; Seiller et al., 2012; Refsgaard et al., 2014). To study the second source of uncertainty, the equifinality of parameter sets is then taken into account by considering, for each type of calibration, the outputs associated with equifinal parameter sets. Finally, to evaluate the third source of uncertainty, five hydrological models of different levels of complexity (GR4J, MORDOR, HSAMI, SWAT and HYDROTEL) are applied to the Quebec catchment of the Au Saumon River. The three sources of uncertainty are evaluated both under past observed climatic conditions and under future climatic conditions. The results show that, given the evaluation method followed in this doctoral work, the use of hydrological models of different levels of complexity is the main source of variability in streamflow projections under future climatic conditions, followed by the lack of robustness in parameter identification. The hydrological projections generated by an ensemble of equifinal parameter sets are close to those associated with the optimal parameter set. Consequently, more effort should be invested in improving the robustness of models for climate change impact studies, in particular by developing more appropriate model structures and by proposing calibration procedures that increase their robustness. This work provides a detailed assessment of our ability to diagnose the impacts of climate change on the water resources of the Au Saumon catchment and proposes an original methodological analysis framework that can be directly applied or adapted to other hydro-climatic contexts.
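
As a bare illustration of the differential split-sample logic behind the first source of uncertainty, the sketch below calibrates a toy one-parameter linear-reservoir model on one synthetic climate sub-period and validates it on a contrasted one. The model, the synthetic forcing and the NSE criterion are illustrative assumptions standing in for the full hydrological models used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear_reservoir(rain, k):
    """Toy one-parameter rainfall-runoff model: the store drains at rate k."""
    store, flow = 0.0, np.empty_like(rain)
    for t, p in enumerate(rain):
        store += p
        flow[t] = k * store
        store -= flow[t]
    return flow

def nse(obs, sim):
    """Nash-Sutcliffe efficiency (1 is a perfect fit)."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Synthetic forcing: a wet first half and a dry second half (contrasted periods)
rain = np.concatenate([rng.gamma(2.0, 3.0, 500), rng.gamma(2.0, 1.0, 500)])
obs = linear_reservoir(rain, k=0.3) + rng.normal(0, 0.5, rain.size)

wet, dry = slice(0, 500), slice(500, 1000)

def calibrate(period):
    """Grid-search the k that maximises NSE over the calibration period."""
    ks = np.linspace(0.05, 0.95, 91)
    scores = [nse(obs[period], linear_reservoir(rain, k)[period]) for k in ks]
    return ks[int(np.argmax(scores))]

# Differential split-sample test: calibrate on one climate, validate on the other
for cal, val, label in [(wet, dry, "wet->dry"), (dry, wet, "dry->wet")]:
    k_cal = calibrate(cal)
    print(label, "k =", round(k_cal, 2),
          "NSE(validation) =", round(nse(obs[val], linear_reservoir(rain, k_cal)[val]), 3))
```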

Relevance: 100.00%

Abstract:

The phosphodiesterase 4 (PDE4) family comprises cAMP-specific phosphodiesterases that play an important role in the inflammatory response, and PDE4 is the major PDE type found in inflammatory cells. A significant number of PDE4-specific inhibitors have been developed and are currently being investigated for use as therapeutic agents. Apremilast, a small-molecule inhibitor of PDE4, is in development for chronic inflammatory disorders and has shown promise for the treatment of psoriasis and psoriatic arthritis as well as other inflammatory diseases. It has been found to be safe and well tolerated in humans, and in March 2014 it was approved by the US Food and Drug Administration for the treatment of adult patients with active psoriatic arthritis. The only other PDE4 inhibitor on the market is Roflumilast, which is used for the treatment of respiratory disease. Roflumilast is approved in the EU for the treatment of COPD and was recently approved in the US for treatment to reduce the risk of COPD exacerbations. Roflumilast is also a selective PDE4 inhibitor, administered as an oral tablet once daily, and is thought to act by increasing cAMP within lung cells. As both compounds (Apremilast and Roflumilast) selectively inhibit PDE4 but are targeted at different diseases, there is a need for a clear understanding of their mechanism of action (MOA). Differences and similarities in MOA should be defined for the purposes of labelling, for communication to the scientific community, physicians, and patients, and for an extension of utility to other diseases and therapeutic areas. In order to obtain a complete comparative picture of the MOA of both inhibitors, additional molecular and cellular biology studies are required to more fully elucidate the signalling mediators downstream of PDE4 inhibition which result in alterations in pro- and anti-inflammatory gene expression. My studies were conducted to directly compare Apremilast with Roflumilast, in order to substantiate the differences observed in the molecular and cellular effects of these compounds, and to search for other possible differentiating effects. Therefore the main aim of this thesis was to utilise cutting-edge biochemical techniques to discover whether Apremilast and Roflumilast work with different modes of action. In the first part of my thesis I used novel genetically encoded FRET-based cAMP sensors targeted to different intracellular compartments, in order to monitor cAMP levels within specific microdomains of cells following challenge with Apremilast and Roflumilast; this revealed that Apremilast and Roflumilast do regulate different pools of cAMP in cells. In the second part of my thesis I focussed on assessing whether Apremilast and Roflumilast cause differential effects on the PKA phosphorylation state of proteins in cells. I used various biochemical techniques (Western blotting, substrate kinase arrays and reverse phase protein arrays) and found that Apremilast and Roflumilast do lead to differential PKA substrate phosphorylation. For example, I found that Apremilast increases the phosphorylation of Ribosomal Protein S6 at Ser240/244 and of Fyn at Y530 in the S6 ribosomal pathway of rheumatoid arthritis synovial fibroblasts and HEK293 cells, whereas Roflumilast does not. This data suggests that Apremilast has distinct biological effects from those of Roflumilast and could represent a new therapeutic role for Apremilast in other diseases.
In the final part of my thesis, phage display technology was employed in order to identify any novel binding motifs that associate with PDE4 and to identify sequences that were differentially regulated by the inhibitors, in an attempt to find binding motifs that may exist in previously characterised signalling proteins. Peptide array technology was then used to confirm binding of specific peptide sequences or motifs. Results showed that Apremilast and Roflumilast can either enhance or decrease the binding of PDE4A4 to specific peptide sequences or motifs that are found in a variety of proteins in the human proteome, most interestingly ubiquitin-related proteins. The data from this chapter are preliminary but may be used in the discovery of novel binding partners for PDE4 or to provide a new role for PDE inhibition in disease. Therefore the work in this thesis provides a unique snapshot of the complexity of the cAMP signalling system and is the first to directly compare the action of the two approved PDE4 inhibitors in a detailed way.

Relevance: 100.00%

Abstract:

The central motif of this work is prediction and optimization in the presence of multiple interacting intelligent agents. We use the phrase 'intelligent agents' to imply, in some sense, a 'bounded rationality', the exact meaning of which varies depending on the setting. Our agents may not be 'rational' in the classical game-theoretic sense, in that they don't always optimize a global objective. Rather, they rely on heuristics, as is natural for human agents or even software agents operating in the real world. Within this broad framework we study the problem of influence maximization in social networks where the behavior of agents is myopic, but complication stems from the structure of interaction networks. In this setting, we generalize two well-known models and give new algorithms and hardness results for our models. Then we move on to models where the agents reason strategically but are faced with considerable uncertainty. For such games, we give a new solution concept and analyze a real-world game using our techniques. Finally, the richest model we consider is that of Network Cournot Competition, which deals with strategic resource allocation in hypergraphs, where agents reason strategically and their interaction is specified indirectly via the players' utility functions. For this model, we give the first equilibrium computability results. In all of the above problems, we assume that payoffs for the agents are known. However, for real-world games, obtaining the payoffs can be quite challenging. To this end, we also study the inverse problem of inferring payoffs given game history. We propose and evaluate a data-analytic framework and show that it is fast and performant.
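
The abstract does not name the two diffusion models that are generalized; the sketch below shows the classic greedy seed-selection routine for the standard independent cascade model, purely to illustrate the kind of influence-maximization computation involved. The graph, the activation probabilities and the budget are illustrative assumptions.

```python
import random

random.seed(7)

# A small random directed graph: edges[u] = list of (v, activation_probability)
N = 60
edges = {u: [(v, 0.1) for v in random.sample(range(N), 5) if v != u]
         for u in range(N)}

def simulate_spread(seeds):
    """One Monte Carlo run of the independent cascade model."""
    active, frontier = set(seeds), list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v, p in edges[u]:
                if v not in active and random.random() < p:
                    active.add(v)
                    nxt.append(v)
        frontier = nxt
    return len(active)

def expected_spread(seeds, runs=200):
    return sum(simulate_spread(seeds) for _ in range(runs)) / runs

def greedy_seed_selection(budget):
    """Classic greedy heuristic: repeatedly add the node with the largest
    estimated marginal gain in expected spread."""
    seeds = []
    for _ in range(budget):
        gains = {u: expected_spread(seeds + [u]) for u in range(N) if u not in seeds}
        seeds.append(max(gains, key=gains.get))
    return seeds

best = greedy_seed_selection(budget=3)
print("chosen seeds:", best, "expected spread:", round(expected_spread(best), 1))
```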

Relevance: 100.00%

Abstract:

Background: This study is part of an interactive improvement intervention aimed at facilitating empowerment-based chronic kidney care using data from persons with CKD and their family members. There are many challenges to implementing empowerment-based care, and it is therefore necessary to study the implementation process. The aim of this study was to generate knowledge regarding the implementation process of an improvement intervention of empowerment for those who require chronic kidney care. Methods: A prospective single qualitative case study was chosen to follow the process of the implementation over a two-year period. Twelve health care professionals were selected based on their various role(s) in the implementation of the improvement intervention. Data collection consisted of digitally recorded project group meetings, field notes of the meetings, and individual interviews before and after the improvement project. These multiple data were analyzed using qualitative latent content analysis. Results: Two facilitator themes emerged: Moving spirit and Encouragement. The healthcare professionals described a willingness to individualize care and to increase their professional development in the field of chronic kidney care. The implementation process was strongly reinforced by both the researchers working interactively with the staff and the project group. One theme emerged as a barrier: the Limitations of the organization. Changes in the organization hindered the implementation of the intervention throughout the study period, and the lack of interplay in the organization impeded the process the most. Conclusions: The findings indicated the complexity of maintaining a sustainable and lasting implementation over a period of two years. Implementing empowerment-based care was found to be facilitated by cooperation between all of the involved healthcare professionals. Furthermore, long-term improvement interventions need strong encouragement from all levels of the organization to maintain engagement, even when they are initiated by the health care professionals themselves.

Relevance: 90.00%

Abstract:

Lipidic mixtures present a particular phase-change profile highly affected by their unique crystalline structure. However, classical solid-liquid equilibrium (SLE) thermodynamic modeling approaches, which assume the solid phase to be a pure component, sometimes fail to correctly describe the phase behavior. In addition, their inadequacy increases with the complexity of the system. To overcome some of these problems, this study describes a new procedure to depict the SLE of fatty binary mixtures presenting solid solutions, namely the Crystal-T algorithm. Considering the non-ideality of both the liquid and solid phases, this algorithm is aimed at determining the temperatures at which the first and the last crystal of the mixture melt. The evaluation is focused on experimental data measured and reported in this work for systems composed of triacylglycerols and fatty alcohols. The liquidus and solidus lines of the SLE phase diagrams were described using excess-Gibbs-energy-based equations, and the group contribution UNIFAC model was used to calculate the activity coefficients of both the liquid and solid phases. Very low deviations between theoretical and experimental data evidenced the strength of the algorithm, contributing to enlarging the scope of SLE modeling.
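
For context, liquidus and solidus descriptions of this kind are usually built on the classical solid-liquid equilibrium relation below, written per component with heat-capacity-difference terms neglected; the exact formulation used by the Crystal-T algorithm may differ from this textbook form.

```latex
% Classical SLE relation for component i distributed between a liquid (L) and a
% solid solution (S), neglecting the heat-capacity-difference terms:
\begin{equation}
\ln\!\left(\frac{x_i^{L}\,\gamma_i^{L}}{x_i^{S}\,\gamma_i^{S}}\right)
  = \frac{\Delta H_{\mathrm{fus},i}}{R}
    \left(\frac{1}{T_{\mathrm{fus},i}} - \frac{1}{T}\right)
\end{equation}
% Here x and gamma are the mole fractions and activity coefficients in each
% phase, and T_{fus,i} and Delta H_{fus,i} are the pure-component melting
% temperature and enthalpy of fusion. Setting x_i^S gamma_i^S = 1 recovers the
% usual pure-solid (eutectic) limit that the classical approaches assume.
```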

Relevance: 90.00%

Abstract:

Thoracic injuries in general are of great importance due to their high incidence and high mortality. Thoracic impalement injuries are rare but severe due to the combination of cause, effect and result. This study's primary objective is to report the case of a young man who was impaled by a two-wheeled horse-carriage shaft while crashing his motorcycle in a rural zone. An EMT-B unit was called to the crash scene, and a conscious patient was found sustaining a severe impalement injury to the left hemithorax, suspended above the ground by the axial skeleton with the carriage shaft coming across his left chest. As a secondary objective, a literature review of thoracic impalement injuries is performed. Cases of thoracic impalement injury require unique and individualized care based on injury severity and the affected organs. Reported protocols for managing impalement injuries are entirely anecdotal, with no uniformity in the approach to and management of impaled patients. In penetrating trauma, it is essential not to remove the impaled object, so that possible vascular lesions remain tamponaded by the object, avoiding major bleeding and exsanguinating haemorrhage. Severely impaled thoracic patients should be transferred to a specialist trauma centre, as these lesions typically require complex multidisciplinary treatment. High-energy thoracic impalement injuries are rare and carry a high mortality rate, due to the complexity of the trauma and associated injuries such as thoracic wall and lung lesions. Modern medicine still seems limited in cases of such seriousness, and does not always achieve satisfactory results.

Relevance: 90.00%

Abstract:

Monte Carlo track structure (MCTS) simulations have been recognized as useful tools for radiobiological modeling. However, the authors noticed several issues regarding the consistency of reported data. Therefore, in this work, they analyze the impact of various user-defined parameters on simulated direct DNA damage yields. In addition, they draw attention to discrepancies in the published literature in DNA strand break (SB) yields and selected methodologies. The MCTS code Geant4-DNA was used to compare radial dose profiles in a nanometer-scale region of interest (ROI) for photon sources of varying sizes and energies. Then, electron tracks of 0.28 keV-220 keV were superimposed on a geometric DNA model composed of 2.7 × 10^6 nucleosomes, and SBs were simulated according to four definitions based on energy deposits or energy transfers in DNA strand targets compared to a threshold energy E_TH. The SB frequencies and complexities in nucleosomes as a function of incident electron energy were obtained. SBs were classified into higher-order clusters such as single and double strand breaks (SSBs and DSBs) based on inter-SB distances and on the number of affected strands. Comparisons of different nonuniform dose distributions lacking charged particle equilibrium may lead to erroneous conclusions regarding the effect of energy on relative biological effectiveness. The energy transfer-based SB definitions give SB yields similar to the energy deposit-based definition when E_TH ≈ 10.79 eV, but deviate significantly for higher E_TH values. Between 30 and 40 nucleosomes/Gy show at least one SB in the ROI. The number of nucleosomes that present a complex damage pattern of more than 2 SBs, and the degree of complexity of the damage in these nucleosomes, diminish as the incident electron energy increases. DNA damage classification into SSBs and DSBs is highly dependent on the definitions of these higher-order structures and their implementations. The authors show that, for the four studied models, the expected yields differ by up to 54% for SSBs and by up to 32% for DSBs, as a function of the incident electron energy and of the models being compared. MCTS simulations make it possible to compare direct DNA damage types and complexities induced by ionizing radiation. However, simulation results depend to a large degree on user-defined parameters, definitions, and algorithms such as the DNA model, the dose distribution, the SB definition, and the DNA damage clustering algorithm. These interdependencies should be well controlled during the simulations and explicitly reported when comparing results to experiments or calculations.
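
Because SSB/DSB classification depends entirely on the chosen clustering convention, the sketch below makes that dependence concrete using one common convention: breaks on opposite strands within 10 bp are paired into a DSB and the remainder are counted as SSBs. The threshold and the pairing rule are assumptions for illustration, not the authors' algorithm.

```python
def classify_breaks(breaks, max_separation_bp=10):
    """Cluster strand breaks into SSBs and DSBs.

    `breaks` is a list of (position_bp, strand) with strand in {0, 1}.
    A DSB is scored when two breaks on opposite strands lie within
    `max_separation_bp`; each break is used at most once. Remaining breaks
    are counted as SSBs. Both choices are conventions that vary between
    published damage-clustering models.
    """
    breaks = sorted(breaks)                 # sort by genomic position
    used = [False] * len(breaks)
    dsb = 0
    for i, (pos_i, strand_i) in enumerate(breaks):
        if used[i]:
            continue
        for j in range(i + 1, len(breaks)):
            pos_j, strand_j = breaks[j]
            if pos_j - pos_i > max_separation_bp:
                break
            if not used[j] and strand_j != strand_i:
                used[i] = used[j] = True
                dsb += 1
                break
    ssb = used.count(False)
    return ssb, dsb

# Example: two nearby breaks on opposite strands plus two unpaired breaks
example = [(100, 0), (105, 1), (108, 0), (500, 1)]
print(classify_breaks(example))   # -> (2, 1): one DSB (100/105), two SSBs
```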