910 results for complexity metrics
Abstract:
Context: Obfuscation is a common technique used to protect software against malicious reverse engineering. Obfuscators manipulate the source code to make it harder to analyze and more difficult for the attacker to understand. Although different obfuscation algorithms and implementations are available, they have never been directly compared in a large-scale study. Aim: This paper aims at evaluating and quantifying the effect of several different obfuscation implementations (both open source and commercial), to help developers and project managers decide which one to adopt. Method: In this study we applied 44 obfuscations to 18 subject applications covering a total of 4 million lines of code. The effectiveness of these source code obfuscations was measured using 10 code metrics, considering the modularity, size, and complexity of the code. Results: Results show that some of the considered obfuscations are effective in making code metrics change substantially from original to obfuscated code, although this change (called the potency of the obfuscation) differs across metrics. In the paper we recommend which obfuscations to select, given the security requirements of the software to be protected.
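To make the potency notion concrete, here is a minimal sketch assuming the classic definition from Collberg et al. (the abstract does not spell out the paper's exact formula): potency is the relative change of a code metric from original to obfuscated code. The metric names and values below are purely illustrative.

```python
# Hypothetical sketch of the potency measure, following the classic definition
# by Collberg et al.: potency = m(obfuscated) / m(original) - 1, where m is a
# code metric. Metric names and values are illustrative, not from the paper.

def potency(metric_original: float, metric_obfuscated: float) -> float:
    """Relative change of a code metric from original to obfuscated code."""
    return metric_obfuscated / metric_original - 1.0

# Illustrative metric values for one subject application (invented numbers).
metrics = {
    "cyclomatic_complexity": (1200.0, 3100.0),
    "lines_of_code": (50_000.0, 61_000.0),
    "coupling_between_objects": (340.0, 512.0),
}

for name, (orig, obf) in metrics.items():
    print(f"{name}: potency = {potency(orig, obf):+.2f}")
```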
Abstract:
This study was designed to validate a constructivist learning framework, herein referred to as Accessible Immersion Metrics (AIM), for second language acquisition (SLA), and to compare two delivery methods of the same framework. The AIM framework was originally developed in 2009 and is proposed as a "how to" guide for applying constructivist learning principles to the second language classroom. Piloted in 2010 at Champlain College St-Lambert, the AIM model allows language learning to occur free of a fixed schedule, to be socially constructive through the use of task-based assessments, and to remain relevant to the learner's life experience by focusing on the students' needs rather than on course content.
Abstract:
Metazoans rely on efficient mechanisms to oppose infections caused by pathogens. The immediate and first-line defense mechanism(s) in metazoans, referred to as the innate immune system, is initiated upon recognition of microbial intruders by germline-encoded receptors and is executed by a set of rapid effector mechanisms. Adaptive immunity is restricted to vertebrate species and is controlled and assisted by the innate immune system. Interestingly, most of the basic signaling cascades that regulate the primeval innate defense mechanism(s) have been well conserved during evolution, for instance between humans and the fruit fly, Drosophila melanogaster. Being devoid of adaptive signaling and effector systems, Drosophila has become an established model system for studying pristine innate immune cascades and reactions. In general, an immune response is evoked when microorganisms pass the fruit fly's physical barriers (e.g. cuticle, epithelial lining of gut and trachea), and it is mainly executed in the hemolymph, the equivalent of the mammalian blood. Innate immunity in the fruit fly consists of a phenoloxidase (PO) response, a cellular response (hemocytes), an antiviral response, and the NF-κB-dependent production of antimicrobial peptides referred to as the humoral response. The JAK/STAT and Jun kinase signaling cascades are also implicated in the defense against pathogens.
Severus Snape: The Complexity and Unconventional Heroism of Severus Snape in the Harry Potter Books
Abstract:
Being an evildoer and being evil are not always the same thing; author J. K. Rowling's character Professor Severus Snape from the Harry Potter series balances on that very line. Although he is unfair and mean to the protagonist Harry Potter throughout the series, Professor Snape is revealed as a hero in the seventh book, Harry Potter and the Deathly Hallows (2007). This essay focuses on some of the complex psychological reasons why Snape acts the way he does towards Harry and why many readers consider him to be just as great a hero as the protagonist. It argues that his difficult upbringing is the cause of his complexity; the series is analyzed from a structuralist perspective, using A. J. Greimas's actantial model and Frank Kermode's theories about endings and plot twists. Snape's hatred of Harry's father, caused by years of bullying, is examined, as well as his love for Harry's mother. The essay also discusses in what ways Snape's change of allegiance, brought on by his eternal love for Harry's mother, is a great aid in defeating the Dark Lord.
Abstract:
OBJECTIVE: Intravoxel incoherent motion (IVIM) is an MRI technique with potential applications in measuring brain tumor perfusion, but its clinical impact remains to be determined. We assessed the usefulness of IVIM metrics in predicting survival in newly diagnosed glioblastoma. METHODS: Fifteen patients with glioblastoma underwent MRI, including spin-echo echo-planar DWI using 13 b-values ranging from 0 to 1000 s/mm². Parametric maps for the diffusion coefficient (D), pseudodiffusion coefficient (D*), and perfusion fraction (f) were generated for contrast-enhancing regions (CER) and non-enhancing regions (NCER). Regions of interest were manually drawn in regions of maximum f and on the corresponding dynamic susceptibility contrast images. Prognostic factors were evaluated by Kaplan-Meier survival and Cox proportional hazards analyses. RESULTS: We found that fCER and D*CER correlated with rCBFCER. The best cutoffs for 6-month survival were fCER > 9.86% and D*CER > 21.712 ×10⁻³ mm²/s (100% sensitivity, 71.4% specificity, 100% and 80% positive predictive values, and 80% and 100% negative predictive values; AUC: 0.893 and 0.857, respectively). Treatment yielded the highest hazard ratio (5.484; 95% CI: 1.162-25.88; AUC: 0.723; P = 0.031); fCER combined with treatment predicted survival with 100% accuracy. CONCLUSIONS: The IVIM metrics fCER and D*CER are promising biomarkers of 6-month survival in newly diagnosed glioblastoma.
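For context, here is a minimal sketch of how the D, D*, and f maps are typically estimated per voxel, assuming the standard biexponential IVIM signal model (the abstract does not detail the fitting procedure). The specific 13-value b-scheme, the synthetic signal, and the initial guesses are assumptions, not the study's data.

```python
# A minimal IVIM fitting sketch, assuming the standard biexponential model
# S(b) = S0 * (f*exp(-b*D_star) + (1-f)*exp(-b*D)). Signal values are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def ivim(b, s0, f, d_star, d):
    """Biexponential IVIM model (d_star = pseudodiffusion, d = diffusion)."""
    return s0 * (f * np.exp(-b * d_star) + (1 - f) * np.exp(-b * d))

# A plausible 13-value scheme spanning 0-1000 s/mm^2 (assumed, not the paper's).
b_values = np.array([0, 10, 20, 40, 80, 110, 140, 170, 200, 300, 500, 750, 1000], float)

# Synthetic voxel: f = 0.10, D* = 21.7e-3 mm^2/s, D = 0.8e-3 mm^2/s, plus noise.
rng = np.random.default_rng(0)
signal = ivim(b_values, 1.0, 0.10, 21.7e-3, 0.8e-3) + rng.normal(0, 0.005, b_values.size)

# Bounded least-squares fit; bounds keep f in [0, 1] and D* well above D.
p0 = (1.0, 0.1, 10e-3, 1e-3)
bounds = ([0, 0, 1e-3, 1e-5], [2, 1, 1, 1e-2])
(s0, f, d_star, d), _ = curve_fit(ivim, b_values, signal, p0=p0, bounds=bounds)
print(f"f = {f:.2%}, D* = {d_star*1e3:.2f}e-3 mm^2/s, D = {d*1e3:.2f}e-3 mm^2/s")
```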
Abstract:
This was presented during the 2nd annual Library Research and Innovation Practices at the University of Maryland Libraries, McKeldin Library, on June 8, 2016.
Abstract:
This presentation was one of four during a Mid-Atlantic Regional Archives Conference presentation on April 15, 2016. Digitization of collections can help to improve internal workflows, make materials more accessible, and create new and engaging relationships with users. Laurie Gemmill Arp will discuss the LYRASIS Digitization Collaborative, created to assist institutions with their digitization needs, and how it has worked to help institutions increase connections with users. Robin Pike from the University of Maryland will discuss how they factor requests for access into selection for digitization and how they track the use of digitized materials. Laura Drake Davis of James Madison University will discuss the establishment of a formal digitization program, its impact on users, and the resulting increased use of their collections. Linda Tompkins-Baldwin will discuss Digital Maryland’s partnership with the Digital Public Library of America to provide access to archives held by institutions without a digitization program.
Abstract:
In this paper we extend recent results of Fiorini et al. on the extension complexity of the cut polytope and related polyhedra. We first describe a lifting argument that shows exponential extension complexity for a number of NP-complete problems, including subset-sum and three-dimensional matching. We then obtain a relationship between the extension complexity of the cut polytope of a graph and that of its graph minors. Using this, we show exponential extension complexity for the cut polytope of a large number of graphs, including those used in quantum information and suspensions of cubic planar graphs.
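As background for readers, these are the standard definitions the abstract relies on; they are textbook facts due to Yannakakis, not claims from this paper.

```latex
% Background definitions (Yannakakis), not claims from this paper.
% Extension complexity: the least number of facets of any polyhedron
% projecting linearly onto P:
\[
  \operatorname{xc}(P) = \min\{\, \#\text{facets of } Q : Q \text{ a polyhedron},\;
  \pi(Q) = P \text{ for some linear map } \pi \,\},
\]
% which for polytopes equals the nonnegative rank of the slack matrix S_P,
% where S_P(v,i) = b_i - A_i v for vertex v and facet inequality A_i x <= b_i:
\[
  \operatorname{xc}(P) = \operatorname{rank}_+(S_P).
\]
```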
Abstract:
Objectives: To analyze the relationship between pharmacotherapeutic complexity and achievement of therapeutic objectives in HIV+ patients on antiretroviral treatment and concomitant dyslipidemia therapy. Materials and methods: A retrospective observational study including HIV patients on stable antiretroviral treatment during the previous 6 months and on dyslipidemia treatment between January and December 2013. The complexity index was calculated with the tool developed by McDonald et al. Other variables analyzed were: age, gender, HIV risk factor, smoking, alcoholism and drug use, psychiatric disorders, adherence to antiretroviral treatment and lipid-lowering drugs, and clinical parameters (HIV viral load, CD4 count, plasma levels of total cholesterol, LDL, HDL, and triglycerides). To determine the predictive factors associated with achievement of therapeutic objectives, univariate analysis was conducted through logistic regression, followed by a multivariate analysis. Results: The study included 89 patients; 56.8% of them met the therapeutic objectives for dyslipidemia. The complexity index was significantly higher (p = 0.02) in those patients who did not reach the objective values (median 51.8 vs. 38.9). Adherence to lipid-lowering treatment was significantly associated with achievement of the therapeutic objectives established for dyslipidemia treatment. Overall, 67.0% of patients met the objectives of their antiretroviral treatment; however, the complexity index was not significantly higher (p = 0.06) in those patients who did not meet said objectives. Conclusions: Pharmacotherapeutic complexity represents a key factor in achieving health objectives in HIV+ patients on treatment for dyslipidemia.
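A hedged sketch of the analysis step described above (univariate followed by multivariate logistic regression). The data frame, column names, and effect sizes are hypothetical, and the McDonald et al. complexity index is computed by their tool, not reimplemented here.

```python
# Hypothetical sketch: logistic regression of therapeutic-objective compliance
# on the complexity index, univariate then multivariate, mirroring the abstract.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "complexity_index": rng.normal(45, 12, 89),   # synthetic index values
    "adherent_lipid_tx": rng.integers(0, 2, 89),  # adherence to lipid-lowering drugs
})
# Synthetic outcome: higher complexity lowers the odds of meeting objectives.
logit_p = -0.04 * (df["complexity_index"] - 45) + 1.2 * df["adherent_lipid_tx"]
df["met_objectives"] = (rng.random(89) < 1 / (1 + np.exp(-logit_p))).astype(int)

# Univariate, then multivariate logistic regression.
for cols in (["complexity_index"], ["complexity_index", "adherent_lipid_tx"]):
    model = sm.Logit(df["met_objectives"], sm.add_constant(df[cols])).fit(disp=0)
    print(model.params.round(3).to_dict())
```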
Abstract:
The ability to estimate the impact of ongoing climate change on the hydrological behavior of hydro-systems is a necessity for anticipating the inevitable and necessary adaptations our societies must consider. In this context, this doctoral project presents a study of the sensitivity of future hydrological projections to: (i) the non-robustness of hydrological model parameter identification, (ii) the use of several equifinal parameter sets, and (iii) the use of different hydrological model structures. To quantify the impact of the first source of uncertainty on model outputs, four climatically contrasted sub-periods are first identified within the observed records. The models are calibrated on each of these four periods, and the resulting outputs are analyzed in calibration and validation following the four configurations of the differential split-sample test (Klemeš, 1986; Wilby, 2005; Seiller et al., 2012; Refsgaard et al., 2014). To study the second source of uncertainty, parameter equifinality is then taken into account by considering, for each calibration type, the outputs associated with equifinal parameter sets. Finally, to evaluate the third source of uncertainty, five hydrological models of different levels of complexity (GR4J, MORDOR, HSAMI, SWAT, and HYDROTEL) are applied to the catchment of the Au Saumon River in Quebec. The three sources of uncertainty are evaluated both under past observed climatic conditions and under future climatic conditions. The results show that, given the evaluation method followed in this doctorate, the use of hydrological models of different levels of complexity is the main source of variability in streamflow projections under future climatic conditions, followed by the lack of robustness of parameter identification. Hydrological projections generated by an ensemble of equifinal parameter sets are close to those associated with the optimal parameter set. Consequently, more effort should be invested in improving model robustness for climate change impact studies, notably by developing more appropriate model structures and by proposing calibration procedures that increase their robustness. This work provides a detailed answer regarding our capacity to diagnose the impacts of climate change on the water resources of the Au Saumon catchment, and proposes an original methodological approach that can be directly applied or adapted to other hydro-climatic contexts.
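A minimal sketch of the differential split-sample test loop described above: calibrate a model on each climatically contrasted sub-period and validate it on every other one. The `calibrate` and `simulate` functions are hypothetical placeholders standing in for a real hydrological model such as GR4J.

```python
# Minimal differential split-sample test (Klemeš, 1986) sketch: every
# (calibration period, validation period) pair of contrasted sub-periods.
from itertools import permutations

periods = ["dry", "wet", "warm", "cold"]  # the four contrasted sub-periods

def calibrate(model: str, period: str) -> dict:
    """Return a parameter set fitted on `period` (hypothetical placeholder)."""
    return {"model": model, "calibrated_on": period}

def simulate(params: dict, period: str) -> float:
    """Return a skill score, e.g. NSE, on `period` (hypothetical placeholder)."""
    return 0.8 if params["calibrated_on"] == period else 0.6

for cal, val in permutations(periods, 2):
    params = calibrate("GR4J", cal)
    score = simulate(params, val)
    print(f"calibrated on {cal:4s} -> validated on {val:4s}: NSE = {score:.2f}")
```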
Abstract:
In this work we consider several instances of the following problem: "how complicated can the isomorphism relation for countable models be?" Using the Borel reducibility framework, we investigate this question with regard to the space of countable models of particular complete first-order theories. We also investigate to what extent this complexity is mirrored in the number of back-and-forth inequivalent models of the theory. We consider this question for two large and related classes of theories. First, we consider o-minimal theories, showing that if T is o-minimal, then the isomorphism relation is either Borel complete or Borel. Further, if it is Borel, we characterize exactly which values can occur, and when they occur. In all cases Borel completeness implies lambda-Borel completeness for all lambda. Second, we consider colored linear orders, which are (complete theories of) a linear order expanded by countably many unary predicates. We discover the same characterization as with o-minimal theories, taking the same values, with the exception that all finite values except two are possible. We characterize exactly when each possibility occurs, which is similar to the o-minimal case. Additionally, we extend Schirrman's theorem, showing that if the language is finite, then T is countably categorical or Borel complete. As before, in all cases Borel completeness implies lambda-Borel completeness for all lambda.
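For background, these are the standard Borel reducibility definitions assumed above (due to Friedman and Stanley); they are textbook facts, not results of this work.

```latex
% Standard definitions (Friedman-Stanley), assumed by the abstract above.
% Borel reducibility between equivalence relations E on X and F on Y:
\[
  E \le_B F \iff \exists\, f\colon X \to Y \text{ Borel s.t. }
  x \mathbin{E} y \iff f(x) \mathbin{F} f(y).
\]
% A theory T is Borel complete when isomorphism on its countable models
% is maximal under \le_B among all such isomorphism relations:
\[
  \cong_T \text{ Borel complete} \iff
  \cong_{L'} \,\le_B\, \cong_T \text{ for every countable language } L'.
\]
```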
Abstract:
The farm-gate value of extensive beef production from the northern Gulf region of Queensland, Australia, is ~$150 million annually. Poor profitability and declining equity are common issues for most beef businesses in the region. The beef industry relies primarily on native pasture systems, and studies continue to report a decline in the condition and productivity of important land types in the region. Governments and Natural Resource Management groups are investing significant resources to restore landscape health and productivity. Fundamental community expectations also include broader environmental outcomes, such as reducing beef industry greenhouse gas emissions. Whole-of-business analysis results are presented from 18 extensive beef businesses (producers) to highlight the complex social and economic drivers of management decisions that impact on the natural resource base and the environment. Business analysis activities also focussed on improving enterprise performance. Profitability, herd performance and greenhouse emission benchmarks are documented and discussed.
Abstract:
The evolution of wireless communication systems leads to Dynamic Spectrum Allocation for Cognitive Radio, which requires reliable spectrum sensing techniques. Among the spectrum sensing methods proposed in the literature, those that exploit cyclostationary characteristics of radio signals are particularly suitable for communication environments with low signal-to-noise ratios or with non-stationary noise. However, such methods have high computational complexity, which directly raises the power consumption of devices that often have very stringent low-power requirements. We propose a strategy for cyclostationary spectrum sensing with reduced energy consumption. This strategy is based on the principle that p processors working at slower frequencies consume less power than a single processor for the same execution time. We derive a strict relation between the energy savings and common parallel system metrics. The results of simulations show that our strategy promises very significant savings in actual devices.
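A worked sketch of the power-scaling principle stated above, assuming the classic CMOS dynamic-power model P ∝ f^α with α ≈ 3 when supply voltage scales with frequency (the paper's exact model may differ): p processors at frequency f/p draw total power p·(f/p)^α over the same execution time, giving an energy ratio of p^(1-α).

```python
# Hedged sketch of the p-slow-processors argument, assuming P ∝ f^alpha
# (alpha ≈ 3 under voltage-frequency scaling). Same wall-clock time, so the
# energy ratio equals the power ratio: p*(1/p)^alpha = p**(1 - alpha).
def energy_ratio(p: int, alpha: float = 3.0) -> float:
    """Energy of p slow processors relative to one fast one, same runtime."""
    return p * (1.0 / p) ** alpha

for p in (1, 2, 4, 8):
    print(f"p = {p}: parallel/serial energy = {energy_ratio(p):.4f}")
```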
Abstract:
Nitrogen (N) is an essential plant nutrient in maize production and, considering only natural sources, is often the world-wide limiting factor for grain yield. For this reason, many farmers around the world supplement available soil N with synthetic forms. Years of over-application of N fertilizer have led to increased N in groundwater and streams due to leaching and run-off from agricultural sites. In the Midwest Corn Belt, much of this excess N eventually makes its way to the Gulf of Mexico, leading to eutrophication (an increase of phytoplankton) and a hypoxic (reduced-oxygen) dead zone. Growing concerns about these problems and the desire for greater input use efficiency have led to demand for crops with improved N use efficiency (NUE), allowing reduced N fertilizer application rates and subsequently lower N pollution. It is well known that roots are responsible for N uptake by plants, but it is relatively unknown how root architecture affects this ability. This research was conducted to better understand the influence of root complexity (RC) in maize on a plant's response to N stress, as well as the influence of RC on other above-ground plant traits. Thirty-one above-ground plant traits were measured for 64 recombinant inbred lines (RILs) from the intermated B73 × Mo17 (IBM) population and their backcrosses (BCs) to either parent, B73 or Mo17, under normal (182 kg N ha-1) and N-deficient (0 kg N ha-1) conditions. The RILs were selected based on results from an earlier experiment by Novais et al. (2011), which screened 232 RILs from the IBM population to obtain their root complexity measurements. The 64 selected RILs comprised 31 of the lowest-complexity RILs (RC1) and 33 of the highest-complexity RILs (RC2) in terms of root architecture (characterized as fractal dimensions). The use of the parental BCs classifies the experiment as Design III, an experimental design developed by Comstock and Robinson (1952) that allows estimation of the significance and level of dominance. Of the 31 traits measured, 12 were whole-plant traits chosen for their documented response to N stress. The other 19 were ear traits commonly measured for their influence on yield. Results showed that genotypes from RC1 and RC2 differ significantly for several above-ground phenotypes. We also observed a difference in the number and magnitude of N treatment responses between the two RC classes. Differences in phenotypic trait correlations, and in how these correlations change in response to N, were also observed between the RC classes. RC did not show a strong correlation with calculated NUE (ΔYield/ΔN). Quantitative genetic analysis utilizing the Design III experimental design revealed significant dominance effects acting on several traits, as well as changes in dominance significance and level between N treatments. Several QTL were mapped for 26 of the 31 traits, and significant N effects were observed across the majority of the genome for some N-stress-indicative traits (e.g. stay-green). This research and related projects are essential to a better understanding of plant N uptake and metabolism, a necessary step towards the goal of breeding crops with better NUE.
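As a small illustration of the NUE measure mentioned above (ΔYield/ΔN), here is a sketch computing the yield response per unit of applied N between the study's two treatments; the yield numbers are invented for illustration.

```python
# Illustration of NUE = ΔYield/ΔN between the two N treatments in the study
# (182 vs. 0 kg N ha^-1). Yield values below are invented, not from the thesis.
def nue(yield_high_n: float, yield_low_n: float,
        n_high: float = 182.0, n_low: float = 0.0) -> float:
    """N use efficiency: change in grain yield per kg of applied N (kg/kg)."""
    return (yield_high_n - yield_low_n) / (n_high - n_low)

print(f"NUE = {nue(9500.0, 6200.0):.2f} kg grain per kg N")
```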