938 results for source analysis
Abstract:
This work falls within the field of research on inclusive practices in mainstream primary schools. In France, the education system offers pupils with special educational needs schooling either in a mainstream class or in a special class, although governments have promoted mainstream schooling since the law of 2005. This raises questions about how school actors provide for these pupils. Building on work showing that teachers who use formative assessment manage pupil diversity better, we examine here the extent to which this function of assessment helps pupils with special educational needs acquire knowledge through the feedback given during oral assessments and whole-class corrections. The analysis of data collected through interviews with teachers and observations of pupils brings out the actors' attitudes, the interactions and the regulations. (DIPF/Orig.)
Abstract:
One of the most challenging tasks underlying many hyperspectral imagery applications is spectral unmixing, which decomposes a mixed pixel into a collection of reflectance spectra, called endmember signatures, and their corresponding fractional abundances. Independent Component Analysis (ICA) has recently been proposed as a tool to unmix hyperspectral data. The basic goal of ICA is to find a linear transformation to recover independent sources (abundance fractions) given only sensor observations that are unknown linear mixtures of the unobserved independent sources. In hyperspectral imagery the sum of the abundance fractions associated with each pixel is constant due to physical constraints in the data acquisition process; thus, the sources cannot be independent. This paper addresses hyperspectral data source dependence and its impact on ICA performance. The study considers simulated and real data. In simulated scenarios, hyperspectral observations are described by a generative model that takes into account the degradation mechanisms normally found in hyperspectral applications. We conclude that ICA does not correctly unmix all sources; this conclusion is based on a study of the mutual information. Nevertheless, some sources may be well separated, particularly if the number of sources is large and the signal-to-noise ratio (SNR) is high.
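The dependence problem described here lends itself to a quick numerical check. The following Python sketch (not the paper's code; the endmembers, Dirichlet abundances and SNR are illustrative assumptions) mixes sum-to-one abundances and applies FastICA, so one can observe how well each true source correlates with a recovered component:

```python
# A minimal sketch (not the paper's code) illustrating why the sum-to-one
# constraint hurts ICA-based unmixing: abundances that sum to one are
# necessarily dependent, violating ICA's independence assumption.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_pixels, n_bands, n_sources = 5000, 50, 3

# Hypothetical endmember signatures (rows) and Dirichlet abundances,
# which sum to one per pixel and are therefore statistically dependent.
endmembers = rng.uniform(0.1, 1.0, size=(n_sources, n_bands))
abundances = rng.dirichlet(alpha=np.ones(n_sources), size=n_pixels)

snr_db = 30  # high SNR, the regime where the paper reports partial success
signal = abundances @ endmembers
noise = rng.normal(scale=signal.std() / 10**(snr_db / 20), size=signal.shape)
observations = signal + noise

ica = FastICA(n_components=n_sources, random_state=0)
recovered = ica.fit_transform(observations)  # estimated "abundance" sources

# Compare each recovered component with the best-matching true abundance.
corr = np.abs(np.corrcoef(recovered.T, abundances.T)[:n_sources, n_sources:])
print("best |correlation| per true source:", corr.max(axis=0).round(2))
```

In line with the abstract's conclusion, some components tend to correlate well with true abundances while others do not, and the separation improves with more sources and higher SNR.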
Abstract:
Abstract: In an era of heightened economic competition, organizations increasingly feel the need to measure the performance of their employees, including that of their leaders. To this end, evaluation remains a favoured management tool. Among existing evaluation systems, multi-source feedback (FMS), or 360-degree feedback, is gaining ground. Unfortunately, the impact of this type of system is not yet well understood, the literature being rather silent on what it concretely delivers and, in particular, on how those evaluated react after receiving their evaluation report. Some studies nevertheless show that executives, and especially business chiefs, feel left to their own devices when it comes to the evaluation of their skills. A few years ago, at the request of the Groupement des chefs d'entreprise du Québec (GCEQ), a multi-source instrument was designed by the Laboratoire de recherche sur la performance des entreprises (LaRePe) to measure specifically the performance of business chiefs as leaders. At this stage, its promoters wish to better understand the impact of using their tool, the PDG-Leadership. To fill the gaps in the literature and to meet the GCEQ's need, this thesis focuses on how chiefs react after receiving their evaluation report. The research has a twofold purpose: to examine the variables that influence whether those evaluated take action following their feedback (a theoretical consideration) and, on the other hand, to learn more about the actions undertaken, in short, what the multi-source feedback (FMS) system really delivers (a practical consideration). To carry out the research, a residency was completed; it provided the context for developing a survey questionnaire tailored to business chiefs. The survey reached 351 executives who had been evaluated at least once through the PDG-Leadership; of these, 87 respondents, all members of the Groupement, took part. The conceptual framework is an adaptation of the model proposed by Smither, London and Reilly (2005a); it comprises seven variables, from which five research hypotheses were drawn. Four hypotheses were rejected, while the remaining one was supported only for the women in the sample. Moreover, it is interesting to note that it is not the feedback (the report) that triggers acceptance and then action, but a personal attitude represented by the perceived possibility of change (V4). Among chiefs, there is thus no chain reaction of the kind the theoretical model assumes. Rather, it seems to be the perceived possibility of change, akin to the sense of self-efficacy defined by Bandura (2007), that underlies the taking of action. The data collected also served to generate new knowledge and to bring out a list of 112 actions that chiefs say they undertook after receiving their evaluation report. This list made it possible to categorize the actions taken. The actions they undertake are, however, directed more toward improving the organization than toward their own improvement.
This is one of the contributions of the present thesis.

Abstract: In a context of intense economic competition, organizations are increasingly using performance evaluation instruments. Multi-source feedback, or 360-degree feedback, is one of them. The literature is still largely silent on what this type of evaluation actually delivers and on the reaction it generates among those evaluated. In response to a request from the Groupement des chefs d'entreprise du Québec (GCEQ), a multi-source assessment system was designed by the Laboratoire de recherche sur la performance des entreprises (LaRePe): the PDG-Leadership, used specifically to measure the skills of SME managers as leaders. After some years of use, its developers want to better understand its impact in order to improve it further. To address these theoretical and practical considerations, a survey was conducted among 87 business leaders from Quebec who had already been assessed using this tool. The purpose of this research is twofold: to validate a preliminary model proposed by Smither, London, and Reilly (2005a) by examining the variables that influence whether those evaluated take action as a result of their feedback, and to identify those actions, in short, what the multi-source feedback (FMS) system really delivers. From the analysis of the data collected, a list of 112 actions was established, which in turn led to a categorization of the actions taken. Although the FMS system is effective, it should be noted that entrepreneurs seem to react differently from the other categories of people assessed.
Abstract:
Background: Digital forensics is a rapidly expanding field, due to the continuing advances in computer technology and increases in the data storage capabilities of devices. However, the tools supporting digital forensics investigations have not kept pace with this evolution, often leaving the investigator to analyse large volumes of textual data and rely heavily on their own intuition and experience. Aim: This research proposes that, given the ability of information visualisation to provide an end user with an intuitive way to rapidly analyse large volumes of complex data, such approaches could be applied to digital forensics datasets. Such methods are investigated, supported by a review of the literature regarding the use of such techniques in other fields. The hypothesis of this body of research is that by utilising exploratory information visualisation techniques in the form of a tool to support digital forensic investigations, gains in investigative effectiveness can be realised. Method: To test the hypothesis, this research examines three different case studies which look at different forms of information visualisation and their implementation with a digital forensic dataset. Two of these case studies take the form of prototype tools developed by the researcher, and one case study utilises a tool created by a third-party research group. A pilot study by the researcher is conducted on these cases, with the strengths and weaknesses of each being drawn into the next case study. The culmination of these case studies is a prototype tool, developed to resemble a timeline visualisation of the user behaviour on a device. This tool was subjected to an experiment involving a class of university digital forensics students who were given a number of questions about a synthetic digital forensic dataset. Approximately half were given the prototype tool, named Insight, to use, and the others were given a common open-source tool. The assessed metrics included how long the participants took to complete all tasks, how accurate their answers to the tasks were, and how easy the participants found the tasks to complete. They were also asked for their feedback at multiple points throughout the task. Results: The results showed that there was a statistically significant increase in accuracy for one of the six tasks for the participants using the Insight prototype tool. Participants also found completing two of the six tasks significantly easier when using the prototype tool. There was no statistically significant difference between the completion times of the two participant groups. There were no statistically significant differences in the accuracy of participant answers for five of the six tasks. Conclusions: The results from this body of research show that there is evidence to suggest that there is the potential for gains in investigative effectiveness when information visualisation techniques are applied to a digital forensic dataset. Specifically, in some scenarios, the investigator can draw conclusions which are more accurate than those drawn when using primarily textual tools. There is also evidence to suggest that the investigators reached these conclusions significantly more easily when using a tool with a visual format. None of the scenarios led to the investigators being at a significant disadvantage in terms of accuracy or usability when using the prototype visual tool rather than the textual tool.
It is noted that this research did not show that the use of information visualisation techniques leads to any statistically significant difference in the time taken to complete a digital forensics investigation.
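As a rough illustration of the kind of view the abstract attributes to the Insight prototype, the sketch below is purely illustrative: the event types, timestamps and labels are invented, and this is not the Insight tool. It plots synthetic user-behaviour events on a single time axis, grouped by artefact type:

```python
# A minimal sketch (not the Insight tool itself) of a forensic timeline
# view: file-system, browser and USB events, invented for illustration,
# are plotted on one time axis grouped by artefact type.
import matplotlib.pyplot as plt
from datetime import datetime

# Hypothetical events extracted from a disk image: (artefact type, timestamp, label).
events = [
    ("filesystem", datetime(2023, 5, 1, 9, 14), "report.docx created"),
    ("filesystem", datetime(2023, 5, 1, 9, 47), "report.docx modified"),
    ("browser",    datetime(2023, 5, 1, 9, 20), "visited file-sharing site"),
    ("usb",        datetime(2023, 5, 1, 9, 50), "USB device connected"),
    ("filesystem", datetime(2023, 5, 1, 9, 52), "report.docx copied to E:"),
]

rows = {kind: i for i, kind in enumerate(sorted({e[0] for e in events}))}
fig, ax = plt.subplots(figsize=(8, 2.5))
for kind, ts, label in events:
    ax.plot(ts, rows[kind], "o")
    ax.annotate(label, (ts, rows[kind]), textcoords="offset points",
                xytext=(5, 5), fontsize=8)
ax.set_yticks(list(rows.values()), list(rows.keys()))
ax.set_xlabel("time")
ax.set_title("User behaviour timeline (synthetic data)")
plt.tight_layout()
plt.show()
```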
Abstract:
Carbonic anhydrases are enzymes found ubiquitously in all organisms, where they catalyze the hydration of carbon dioxide to bicarbonate and a proton, and the reverse reaction. They are crucial in respiration, bone resorption, pH regulation, ion transport, and photosynthesis in plants. Of the five classes of carbonic anhydrases (α, β, γ, δ, ζ), this study focused on the α carbonic anhydrases. This class of CAs comprises 16 subfamilies in mammals, including three catalytically inactive enzymes known as Carbonic Anhydrase Related Proteins; their inactivity is due to the loss of one or more histidine residues in the active site. This thesis set out to conduct an evolutionary analysis of carbonic anhydrase sequences from organisms whose lineages extend back to the Cambrian. It was carried out in two phases. The first phase was sequence collection, drawing on many biological sequence databases as sources; its scope included sequence alignments and analysis of the sequences, both manually and in automated form, incorporating several analysis tools. The second phase was phylogenetic analysis and exploration of the subcellular location of the proteins, which was key to the evolutionary analysis. Through these methods it was possible to accomplish the desired result: several thought-provoking sequences were encountered and analyzed thoroughly, and the phylogenetic analysis yielded results that both bolster previous findings and contribute new ones, laying the groundwork for future, more intensive studies.
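As a hedged illustration of the second phase, here is a minimal Biopython sketch of a distance-based phylogenetic step; the input alignment file name is hypothetical, and the thesis does not state that these particular tools were used:

```python
# A minimal sketch (assumed workflow, not the thesis pipeline) of the kind
# of phylogenetic step described: build a neighbour-joining tree from an
# existing alignment of carbonic anhydrase sequences using Biopython.
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

# "ca_alignment.fasta" is a hypothetical pre-computed multiple alignment.
alignment = AlignIO.read("ca_alignment.fasta", "fasta")

calculator = DistanceCalculator("identity")   # simple identity-based distances
distance_matrix = calculator.get_distance(alignment)

constructor = DistanceTreeConstructor()
tree = constructor.nj(distance_matrix)        # neighbour-joining tree

Phylo.draw_ascii(tree)                        # quick text rendering of the tree
```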
Abstract:
Resource management policies are frequently designed and planned to target the specific needs of particular sectors, without taking into account the interests of other sectors that share the same resources. In a climate of resource depletion, population growth, increasing energy demand and climate change awareness, it is of great importance to promote the assessment of intersectoral linkages and, by doing so, understand their effects and implications. This need is further heightened when the common use of resources is not solely a national concern, but one that spans several nations. This dissertation studies the energy systems of five south-eastern European countries that share the Sava River Basin (SRB), using a water-food(agriculture)-energy nexus approach. In the electricity generation sector, the use of water is essential to the integrity of the energy systems, as electricity production in the riparian countries relies on two major technologies dependent on water resources: hydro and thermal power plants. For example, in 2012, an average of 37% of the electricity production in the SRB countries was generated by hydropower and 61% in thermal power plants. Within the SRB itself, in terms of existing installed capacity, the basin accommodates close to a tenth of all hydropower capacity while providing cooling water to 42% of the net thermal power capacity currently in operation in the basin. This energy-oriented nexus study explores the dependency of the region's energy systems on the basin's water resources for the period between 2015 and 2030. To do so, a multi-country electricity model was developed, using the open-source modelling tool OSeMOSYS, to provide a quantitative grounding for the analysis. Three main areas are analysed: first, the impact of energy efficiency and renewable energy strategies on the electricity generation mix; secondly, the potential impacts of climate change under a moderate climate change projection scenario; and finally, deriving from the latter point, the cumulative impact of an increase in water demand for irrigation in the agriculture sector. Additionally, electricity trade dynamics are compared across the different scenarios under scrutiny, as an effort to investigate the implications of the aforementioned factors for the electricity markets in the region.
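As a hedged sketch of the kind of cross-scenario comparison described (the scenario names and generation figures below are invented, and this is not the thesis's OSeMOSYS output handling), one might tabulate a generation mix per scenario with pandas:

```python
# A minimal sketch (invented numbers, not the thesis results) of a
# cross-scenario comparison: annual generation mix per scenario.
import pandas as pd

# Hypothetical OSeMOSYS-style output: annual generation (TWh) by technology.
generation = pd.DataFrame({
    "scenario":   ["reference", "reference", "climate", "climate"],
    "technology": ["hydro", "thermal", "hydro", "thermal"],
    "year":       [2030, 2030, 2030, 2030],
    "twh":        [18.5, 30.2, 15.9, 32.8],   # assumed values
})

mix = generation.pivot_table(index="scenario", columns="technology", values="twh")
mix["hydro_share_%"] = 100 * mix["hydro"] / (mix["hydro"] + mix["thermal"])
print(mix.round(1))
```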
Abstract:
While humans can easily segregate and track a speaker's voice in a loud noisy environment, most modern speech recognition systems still perform poorly in loud background noise. The computational principles behind auditory source segregation in humans are not yet fully understood. In this dissertation, we develop a computational model for source segregation inspired by auditory processing in the brain. To support the key principles behind the computational model, we conduct a series of electroencephalography (EEG) experiments using both simple tone-based stimuli and more natural speech stimuli. Most source segregation algorithms utilize some form of prior information about the target speaker or use more than one simultaneous recording of the noisy speech mixtures; other methods build models of the noise characteristics. Source segregation of simultaneous speech mixtures from a single microphone recording, with no knowledge of the target speaker, remains a challenge. Using the principle of temporal coherence, we develop a novel computational model that exploits the difference in the temporal evolution of features belonging to different sources to perform unsupervised monaural source segregation. While using no prior information about the target speaker, the method can gracefully incorporate knowledge about the target speaker to further enhance the segregation. Through a series of EEG experiments we collect neurological evidence to support the principle behind the model. Aside from its unusual structure and computational innovations, the proposed model provides testable hypotheses about the physiological mechanisms of the remarkable perceptual ability of humans to segregate acoustic sources, and about its psychophysical manifestations in navigating complex sensory environments. Results from the EEG experiments provide further insights into the assumptions behind the model and motivate future single-unit studies that could provide more direct evidence for the principle of temporal coherence.
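To make the temporal-coherence principle concrete, here is a minimal sketch, an interpretation under stated assumptions rather than the dissertation's model: spectrogram channels are grouped by the correlation of their temporal envelopes, and each group is resynthesized as one source.

```python
# A minimal sketch (assumptions throughout, not the dissertation's model) of
# the temporal-coherence idea: spectrogram channels whose envelopes rise and
# fall together are grouped as one source; a binary mask separates the mixture.
import numpy as np
from scipy.signal import stft, istft
from sklearn.cluster import SpectralClustering

def segregate(mixture, fs, n_sources=2):
    f, t, spec = stft(mixture, fs=fs, nperseg=512)
    envelopes = np.abs(spec)                        # channel envelopes over time

    # Temporal coherence: correlation of envelope trajectories across channels.
    coherence = np.corrcoef(envelopes)
    coherence = np.nan_to_num((coherence + 1) / 2)  # map to [0, 1] affinity

    labels = SpectralClustering(n_clusters=n_sources,
                                affinity="precomputed",
                                random_state=0).fit_predict(coherence)

    sources = []
    for k in range(n_sources):
        mask = (labels == k)[:, None]               # keep channels in group k
        _, source_k = istft(spec * mask, fs=fs, nperseg=512)
        sources.append(source_k)
    return sources
```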
Abstract:
SQL Injection Attack (SQLIA) remains a technique used by computer network intruders to pilfer an organisation's confidential data. An intruder re-crafts web form inputs and query strings used in web requests with malicious intent to compromise the security of the organisation's confidential data stored in the back-end database. The database is the most valuable data source, and intruders are therefore unrelenting in evolving new techniques to bypass the signature-based solutions currently provided in Web Application Firewalls (WAF) to mitigate SQLIA. There is therefore a need for an automated, scalable methodology for pre-processing SQLIA features fit for a supervised learning model. However, obtaining a ready-made scalable dataset, feature-engineered with numerical attributes, to train Artificial Neural Network (ANN) and Machine Learning (ML) models is a known obstacle to applying artificial intelligence to the ever-evolving novel SQLIA signatures. The proposed approach applies a numerical-attribute encoding ontology to encode features (both legitimate web requests and SQLIA) as numerical data items, so as to extract a scalable dataset for input to a supervised learning model, moving towards an ML SQLIA detection and prevention model. For the numerical encoding of features, the proposed model explores a hybrid of static and dynamic pattern matching by implementing a Non-Deterministic Finite Automaton (NFA), combined with a proxy and a SQL parser Application Programming Interface (API) to intercept and parse web requests in transit to the back-end database. As a step towards a solution to SQLIA, the model allows web requests processed at the proxy and deemed to contain an injected query string to be blocked from reaching the target back-end database. This paper evaluates the performance metrics of a dataset obtained by the numerical encoding of the features ontology in Microsoft Azure Machine Learning (MAML) studio, using a Two-Class Support Vector Machine (TCSVM) binary classifier. This methodology then forms the subject of the empirical evaluation.
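A minimal sketch of the general idea follows; the token list, encoding and training data are invented, and this stands in for, rather than reproduces, the paper's ontology, NFA and the MAML TCSVM. It maps request strings to numeric features and fits a linear two-class SVM:

```python
# A minimal sketch (invented feature encoding, not the paper's ontology) of
# the general approach: map web-request strings to numeric features and train
# a two-class SVM to separate legitimate requests from SQL-injection attempts.
import numpy as np
from sklearn.svm import SVC

SQL_TOKENS = ["'", "--", ";", " or ", " union ", " select ", "="]

def encode(request: str) -> list[float]:
    """Hypothetical numeric encoding: token counts plus request length."""
    lowered = request.lower()
    return [lowered.count(tok) for tok in SQL_TOKENS] + [len(request)]

requests = [
    ("id=42", 0),
    ("name=alice", 0),
    ("id=1' or '1'='1", 1),                        # classic tautology injection
    ("q=x' union select password from users--", 1),
]
X = np.array([encode(r) for r, _ in requests], dtype=float)
y = np.array([label for _, label in requests])

model = SVC(kernel="linear").fit(X, y)             # stand-in for MAML's TCSVM
print(model.predict([encode("id=7' or '1'='1")]))  # expect [1] (attack)
```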
Abstract:
With the increasing complexity of today's software, the software development process is becoming highly time- and resource-consuming. The increasing number of software configurations, input parameters, usage scenarios, supported platforms, external dependencies, and versions plays an important role in expanding the cost of maintaining and repairing unforeseeable software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, that is not the case with software testing and verification. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging that leverage the advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing already existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Afterwards, commonalities between these test case sequence covers are extracted, processed, analyzed, and then presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared between a number of test cases failing for the same reason resemble the faulty execution path, and hence the search space for the faulty execution path can be narrowed down by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers. Optimization techniques are devised to generate shorter and more logical sequence covers, and to select subsequences with a high likelihood of containing the root cause from the set of all possible common subsequences. A hybrid static/dynamic analysis approach is designed to trace the common subsequences back from the failure to the root cause. A debugging tool is created to enable developers to use the approach and to integrate it with an existing Integrated Development Environment (IDE). The tool is also integrated with the environment's program editors so that developers can benefit from both the tool's suggestions and their source code counterparts. Finally, a comparison between the developed approach and state-of-the-art techniques shows that developers need only inspect a small number of lines in order to find the root cause of the fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both algorithm running time and output subsequence length.
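The core intersection idea can be sketched briefly. The toy below is illustrative only (the thesis's algorithms and optimizations are more elaborate): it intersects the statement-level traces of several failing tests via repeated longest-common-subsequence computation:

```python
# A minimal sketch (toy traces, not the thesis algorithms) of the core idea:
# intersect the execution traces of failing tests to narrow down the code
# locations shared by all failures, a crude proxy for the faulty path.
from functools import reduce

def common_subsequence(a: list[str], b: list[str]) -> list[str]:
    """Classic dynamic-programming longest common subsequence of two traces."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = dp[i][j] + 1 if a[i] == b[j] \
                else max(dp[i][j + 1], dp[i + 1][j])
    out, i, j = [], m, n
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return out[::-1]

# Hypothetical statement-level traces of three failing test cases.
failing_traces = [
    ["open", "read", "parse", "validate", "close"],
    ["open", "seek", "read", "parse", "validate"],
    ["open", "read", "stat", "parse", "validate", "close"],
]
suspect_path = reduce(common_subsequence, failing_traces)
print(suspect_path)  # ['open', 'read', 'parse', 'validate'] -> candidate faulty path
```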
Abstract:
We report the suitability of an Einstein-Podolsky-Rosen entanglement source for Gaussian continuous-variable quantum key distribution at 1550 nm. Our source is based on a single continuous-wave squeezed vacuum mode combined with a vacuum mode at a balanced beam splitter. Extending a recent security proof, we characterize the source by quantifying the extractable length of a composably secure key from a finite number of samples under the assumption of collective attacks. We show that distances on the order of 10 km are achievable with this source for a reasonable sample size, despite the fact that the entanglement was generated with the inclusion of a vacuum mode. Our security analysis applies to all states having an asymmetry in the field quadrature variances, including those generated by the superposition of two squeezed modes with different squeezing strengths.
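For orientation, a hedged sketch of the quantity involved (standard background, not the composable finite-size bound derived in the paper): under collective attacks with reverse reconciliation, the asymptotic secret key rate is commonly lower-bounded by the Devetak-Winter expression, where β is the reconciliation efficiency, I(A:B) the mutual information between the trusted parties, and χ(B:E) the Holevo bound on the eavesdropper's information.

```latex
% Devetak-Winter lower bound (asymptotic, collective attacks);
% the paper's composable finite-size key length subtracts further
% correction terms that depend on the number of samples.
K \ge \beta\, I(A{:}B) - \chi(B{:}E)
```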
Abstract:
Crowdfunding (CF) is an increasingly attractive source of funding for social projects. However, to the best of our knowledge, the study of CF for social purposes has remained largely unexplored in the literature. This research undertakes a detailed examination of the role of CF in the early stage of social projects at the regional level. By comparing the characteristics of the projects available on the Portuguese Social Stock Exchange (PSSE) platform with others that did not use this source of financial support, we explore its role in regional development. The results show that, in most cases, both PSSE and Non-Governmental Organization projects complemented the services offered by the State or by the private sector. Furthermore, about a quarter of the projects present on PSSE operated in areas that were not being addressed either by the State or by the private sector. The results also show that more recent social ventures have a greater propensity to use PSSE, as do organizations that work closely with their target audience. We also observed that the use of PSSE was correlated with the geographical scope of the social venture: operating at a local or regional level seems to be strongly associated with the possibility of using social crowdfunding to finance social projects.
Abstract:
Social innovation is a critical factor in the conception of new strategies to deal with increasingly complex social problems. Many of these initiatives are pursued at the local level and are based on the dynamic capabilities of a given territory. Through the analysis of the Cooperative Terra Chã, we assess whether the dynamic capabilities of a territory can generate opportunities for social innovation and how they can be exploited by local communities. We observe that, by using an integrated strategy for the management of a territory's capabilities, new social ventures are able to cope with severe social issues that are not being adequately addressed by other stakeholders.
Abstract:
An accurate amplified fragment length polymorphism (AFLP) method, including three primer sets for the selective amplification step, was developed to establish the phylogenetic position of Photobacterium isolates collected from salmon products. The method was efficient at discriminating the three species Photobacterium phosphoreum, Photobacterium iliopiscarium and Photobacterium kishitanii, until now gathered indistinctly in the Photobacterium phosphoreum species group known to be strongly responsible for seafood spoilage. The AFLP fingerprints enabled the isolates to be separated into two main clusters that, according to the type strains, were assigned to the two species P. phosphoreum and P. iliopiscarium; P. kishitanii was not found in the collection. The accuracy of the method was validated by gyrB-gene sequencing and luxA-gene PCR amplification, which confirmed the species delineation. Most of the isolates of each species were clonally distinct, and even those isolated from the same source showed some diversity. Moreover, this AFLP method may be an excellent tool for genotyping isolates in bacterial communities and for clarifying our knowledge of the roles of the different members of the Photobacterium species group in seafood spoilage.
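For orientation, downstream analysis of AFLP fingerprints is typically a clustering of binary band-presence profiles. The sketch below (synthetic fingerprints, not the study's data or software) groups isolates using Jaccard distances and average-linkage clustering:

```python
# A minimal sketch (synthetic fingerprints, not the study's data) of AFLP-style
# analysis: cluster binary band-presence fingerprints with Jaccard distance,
# a common way such profiles are compared.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical fingerprints: rows = isolates, columns = presence/absence of bands.
fingerprints = np.array([
    [1, 1, 0, 1, 0, 1, 1, 0],   # isolate A
    [1, 1, 0, 1, 0, 1, 0, 0],   # isolate B (close to A)
    [0, 0, 1, 0, 1, 0, 1, 1],   # isolate C
    [0, 1, 1, 0, 1, 0, 1, 1],   # isolate D (close to C)
], dtype=bool)

distances = pdist(fingerprints, metric="jaccard")      # band-sharing dissimilarity
tree = linkage(distances, method="average")            # UPGMA-style clustering
clusters = fcluster(tree, t=2, criterion="maxclust")   # cut into two main clusters
print(clusters)  # e.g. [1 1 2 2]: two clusters, analogous to the two species groups
```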
Abstract:
Cultivation of chilling-tolerant ornamental crops at lower temperature could reduce the energy demands of heated greenhouses. To provide a better understanding of how sub-optimal temperatures (12 °C vs. 16 °C) affect growth of the sensitive Petunia hybrida cultivar 'SweetSunshine Williams', the transcriptome, carbohydrate metabolism and phytohormone homeostasis were monitored in aerial plant parts over four weeks by use of a microarray, enzymatic assays and GC-MS/MS. The data revealed three consecutive phases of chilling response. The first days were marked by a strong accumulation of sugars, particularly in source leaves, preferential up-regulation of genes in the same tissue, and down-regulation of several genes in the shoot apex, especially those involved in the abiotic stress response. The mid-term phase featured a partial normalization of carbohydrate levels and gene expression. After three weeks of chilling exposure, a new stabilized balance was established. Reduced hexose levels in the shoot apex, reduced ratios of sugar levels between the apex and source leaves, and a higher apical sucrose/hexose ratio, associated with decreased activity and expression of cell wall invertase, indicate that prolonged chilling induced sugar accumulation in source leaves at the expense of reduced sugar transport to, and reduced sucrose utilization in, the shoot. This was associated with reduced levels of indole-3-acetic acid and abscisic acid in the apex and high numbers of differentially, particularly up-regulated, genes, especially in the source leaves, including those regulating histones, ethylene action, transcription factors, and a jasmonate-ZIM-domain protein. Transcripts of one Jumonji C domain containing protein and one expansin accumulated in source leaves throughout the chilling period. The results reveal a dynamic and complex disturbance of plant function in response to mild chilling, opening new perspectives for the comparative analysis of differently tolerant cultivars.