976 results for network forensic tools
Abstract:
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies
Abstract:
The LMS plays an indisputable role in the majority of eLearning environments. This type of eLearning system is often used for presenting, solving and grading simple exercises. However, exercises from complex domains, such as computer programming, require heterogeneous systems such as evaluation engines, learning object repositories and exercise resolution environments. The coordination of networks of such disparate systems is rather complex. This work presents a standard approach for the coordination of a network of eLearning systems supporting the resolution of exercises. The proposed approach uses a pivot component embedded in the LMS with two roles: providing an exercise resolution environment and coordinating the communication between the LMS and other systems that expose their functions as web services. The integration of the pivot component with the LMS relies on the Learning Tools Interoperability (LTI) specification. The validation of this approach is made through the integration of the component with LMSs from two vendors.
Abstract:
The restructuring of electricity markets, conducted to increase competition in this sector and decrease electricity prices, brought with it an enormous increase in the complexity of the involved mechanisms. The electricity market became a complex and unpredictable environment, involving a large number of different entities playing in a dynamic scene to obtain the best advantages and profits. Software tools therefore became essential to provide simulation and decision support capabilities, in order to strengthen the involved players' actions. This paper presents the development of a metalearner applied to the decision support of electricity market negotiating entities. The proposed metalearner executes a dynamic artificial neural network to create its own output, taking advantage of several learning algorithms implemented in ALBidS, an adaptive learning system that provides decision support to electricity market players. The proposed metalearner assigns a different weight to each strategy, depending on its individual performance quality. The results of the proposed method are studied and analyzed in scenarios based on real electricity market data, using MASCEM, a multi-agent electricity market simulator that simulates market players' operation in the market.
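The core idea of the metalearner, combining several strategies' outputs with weights tied to each strategy's past performance quality, can be sketched as follows. This is a minimal illustration using inverse-error weighting rather than the dynamic neural network the paper describes; the function and variable names are hypothetical, not taken from ALBidS.

```python
import numpy as np

def metalearner_output(strategy_outputs, past_errors):
    """Combine strategy proposals into one output, weighting each
    strategy by its historical performance (lower error -> higher weight)."""
    errors = np.asarray(past_errors, dtype=float)
    weights = 1.0 / (errors + 1e-9)   # inverse-error weighting (assumed scheme)
    weights /= weights.sum()          # normalise so the weights sum to 1
    return float(np.dot(weights, strategy_outputs))

# Hypothetical example: three strategies proposing bid prices (EUR/MWh),
# each with a running mean absolute error from previous negotiation periods.
proposals = [42.0, 45.5, 40.8]
errors = [2.0, 4.0, 1.0]
price = metalearner_output(proposals, errors)
```

The best-performing strategy (lowest error) dominates the combined bid, while poorer strategies still contribute, which mirrors the paper's notion of per-strategy weights based on performance.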
Abstract:
Dissertation to obtain the degree of Doctor in Electrical and Computer Engineering, specialization of Collaborative Networks
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.
Abstract:
Thesis submitted in fulfilment of the requirements for the Degree of Master of Science in Computer Science
Abstract:
PhD Thesis in Bioengineering
Abstract:
Nitrogen dioxide is a primary pollutant, considered in the estimation of the air quality index, whose excessive presence may cause significant environmental and health problems. In the current work, we propose characterizing the evolution of NO2 levels using geostatistical approaches that deal with both the space and time coordinates. To develop our proposal, a first exploratory analysis was carried out on daily values of the target variable, measured in Portugal from 2004 to 2012, which led to the identification of three influential covariates (type of site, environment and month of measurement). In a second step, appropriate geostatistical tools were applied to model the trend and the space-time variability, thus enabling us to use kriging techniques for prediction without requiring data from a dense monitoring network. This methodology has valuable applications, as it can provide an accurate assessment of nitrogen dioxide concentrations at sites where either data have been lost or there is no monitoring station nearby.
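The kriging prediction at an unmonitored site can be illustrated with a minimal ordinary kriging sketch. This is a spatial-only simplification of the space-time approach in the abstract, with an assumed exponential covariance model and made-up station data; the parameter values are illustrative, not the study's.

```python
import numpy as np

def exp_cov(h, sill=1.0, rng=50.0):
    """Exponential covariance model (assumed sill and range parameters)."""
    return sill * np.exp(-np.abs(h) / rng)

def ordinary_krige(coords, values, target):
    """Ordinary kriging prediction of one value at `target`.
    Solves the kriging system with a Lagrange multiplier row enforcing
    that the weights sum to 1 (unbiasedness constraint)."""
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    K = np.empty((n + 1, n + 1))
    K[:n, :n] = exp_cov(d)
    K[:n, n] = K[n, :n] = 1.0
    K[n, n] = 0.0
    k = np.append(exp_cov(np.linalg.norm(coords - target, axis=1)), 1.0)
    w = np.linalg.solve(K, k)
    return float(w[:n] @ values)

# Hypothetical NO2 measurements (ug/m3) at three stations (km coordinates)
coords = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
no2 = np.array([30.0, 34.0, 28.0])
pred = ordinary_krige(coords, no2, np.array([5.0, 5.0]))
```

At a monitored location the predictor reproduces the observed value exactly, which is the property that makes kriging suitable for filling gaps where data have been lost.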
Abstract:
PhD Thesis in Health Sciences.
Abstract:
PhD thesis in Biomedical Engineering
Abstract:
This study utilised recent developments in forensic aromatic hydrocarbon fingerprint analysis to characterise and identify specific biogenic, pyrogenic and petrogenic contamination. The fingerprinting and data interpretation techniques discussed include the recognition of:
• the distribution patterns of hydrocarbons (alkylated naphthalene, phenanthrene, dibenzothiophene, fluorene, chrysene and phenol isomers),
• “source-specific marker” compounds (individual saturated hydrocarbons, including n-alkanes, n-C5 through n-C40),
• selected benzene, toluene, ethylbenzene and xylene (BTEX) isomers,
• the recalcitrant isoprenoids pristane and phytane, and
• diagnostic ratios of specific petroleum / non-petroleum constituents,
together with the application of various statistical and numerical analysis tools. An unknown sample from the Irish Environmental Protection Agency (EPA) was subjected to origin characterisation by gas chromatography, utilising both flame ionisation and mass spectral detection, in comparison with known reference materials. The percentages of the individual polycyclic aromatic hydrocarbons (PAHs) and biomarker concentrations in the unknown sample were normalised to the sum of the analytes, and the results were compared with the corresponding results for a range of reference materials. In addition to the determination of conventional diagnostic PAH and biomarker ratios, a number of “source-specific” isomeric PAHs within the same alkylation levels were determined, and their relative abundance ratios were computed in order to definitively identify and differentiate the various sources. Statistical logarithmic star plots were generated from both sets of data to give a pictorial representation of the comparison between the unknown sample and the reference products.
The study successfully characterised the unknown sample as being contaminated with a “coal tar” and clearly demonstrates the future role of compound ratio analysis (CORAT) in the identification of possible source contaminants.
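The normalisation and diagnostic-ratio steps described above can be sketched briefly. The analyte names and concentrations below are hypothetical; the computation simply expresses each analyte as a percentage of the summed analytes and forms a ratio of two constituents, such as pristane/phytane.

```python
def normalise_analytes(concs):
    """Express each analyte concentration as a percentage of the
    summed analytes, for comparison against reference materials."""
    total = sum(concs.values())
    return {name: 100.0 * c / total for name, c in concs.items()}

def diagnostic_ratio(concs, a, b):
    """Diagnostic ratio of two constituents, e.g. pristane/phytane."""
    return concs[a] / concs[b]

# Hypothetical concentrations (e.g. ug/g) from a GC-FID/GC-MS run
sample = {"pristane": 12.4, "phytane": 6.2, "C2-phenanthrene": 30.1}
pct = normalise_analytes(sample)
pr_ph = diagnostic_ratio(sample, "pristane", "phytane")
```

Because the normalised percentages always sum to 100, profiles from samples of very different absolute concentration can be overlaid directly, which is what makes the star-plot comparison against reference products meaningful.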
Abstract:
Clinical decision-making requires the synthesis of evidence from literature reviews focused on a specific theme. Evidence synthesis is performed with qualitative assessments and systematic reviews of randomized clinical trials, typically involving statistical pooling with pairwise meta-analyses. These methods include adjusted indirect comparison meta-analysis, network meta-analysis, and mixed-treatment comparison. These tools allow synthesis of evidence and comparison of effectiveness in cardiovascular research.
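The adjusted indirect comparison mentioned above is commonly computed with the Bucher method: when trials compare A vs B and C vs B, the A-vs-C effect is estimated as the difference of the two effects against the common comparator, with variances adding. The numbers below are hypothetical log odds ratios, purely for illustration.

```python
import math

def bucher_indirect(d_ab, se_ab, d_cb, se_cb):
    """Adjusted indirect comparison (Bucher method): estimate the
    A-vs-C effect from A-vs-B and C-vs-B trial results sharing
    comparator B. Standard errors combine in quadrature."""
    d_ac = d_ab - d_cb
    se_ac = math.sqrt(se_ab ** 2 + se_cb ** 2)
    return d_ac, se_ac

# Hypothetical log odds ratios versus a common comparator B
d_ac, se_ac = bucher_indirect(-0.30, 0.10, -0.10, 0.12)
```

Note that the indirect estimate is always less precise than either direct comparison, since the standard errors add in quadrature; network meta-analysis generalises this idea to larger networks of treatments.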
Abstract:
The first scientific meeting of the newly established European SYSGENET network took place at the Helmholtz Centre for Infection Research (HZI) in Braunschweig, April 7-9, 2010. About 50 researchers working in the field of systems genetics using mouse genetic reference populations (GRP) participated in the meeting and exchanged their results, phenotyping approaches, and data analysis tools for studying systems genetics. In addition, the future of GRP resources and phenotyping in Europe was discussed.
Implementation of IPM programs on European greenhouse tomato production areas: Tools and constraints
Abstract:
Whiteflies and whitefly-transmitted viruses are some of the major constraints on European tomato production. The main objectives of this study were to: identify where and why whiteflies are a major limitation on tomato crops; collect information about whiteflies and associated viruses; determine the available management tools; and identify key knowledge gaps and research priorities. This study was conducted within the framework of ENDURE (European Network for Durable Exploitation of Crop Protection Strategies). Two whitefly species are the main pests of tomato in Europe: Bemisia tabaci and Trialeurodes vaporariorum. Trialeurodes vaporariorum is widespread in all areas where the greenhouse industry is present, and B. tabaci has invaded all the subtropical and tropical areas since the early 1990s. Biotypes B and Q of B. tabaci are widespread and especially problematic. Other key tomato pests are Aculops lycopersici, Helicoverpa armigera, Frankliniella occidentalis, and leaf miners. Tomato crops are particularly susceptible to viruses causing Tomato yellow leaf curl disease (TYLCD). High incidences of this disease are associated with high pressure of its vector, B. tabaci. The ranked importance of B. tabaci established in this study correlates with the levels of insecticide use, showing B. tabaci as one of the principal drivers behind chemical control. Confirmed cases of resistance to almost all insecticides have been reported. Integrated Pest Management based on biological control (IPM-BC) is applied in all the surveyed regions and identified as the strategy using fewer insecticides. Other IPM components include greenhouse netting and TYLCD-tolerant tomato cultivars. Sampling techniques differ between regions; decisions are generally based upon whitefly densities and do not relate to control strategies or growing cycles. For population monitoring and control, whitefly species are always identified.
In Europe IPM-BC is the recommended strategy for a sustainable tomato production. The IPM-BC approach is mainly based on inoculative releases of the parasitoids Eretmocerus mundus and Encarsia formosa and/or the polyphagous predators Macrolophus caliginosus and Nesidiocoris tenuis. However, some limitations for a wider implementation have been identified: lack of biological solutions for some pests, costs of beneficials, low farmer confidence, costs of technical advice, and low pest injury thresholds. Research priorities to promote and improve IPM-BC are proposed on the following domains: (i) emergence and invasion of new whitefly-transmitted viruses; (ii) relevance of B. tabaci biotypes regarding insecticide resistance; (iii) biochemistry and genetics of plant resistance; (iv) economic thresholds and sampling techniques of whiteflies for decision making; and (v) conservation and management of native whitefly natural enemies and improvement of biological control of other tomato pests.
Abstract:
In recent years most libraries have focused on mass digitization programs and on keeping born-digital documents, showing and organizing them in a repository. While those repositories have evolved into much more manageable systems, focusing on user expectations and introducing web 2.0 tools, digital preservation is still on the to-do list of most of them. There are quite a lot of studies focused on preservation, and some complex models exist; unfortunately, very few practical systems are running, and it is quite difficult for a library to get involved in a solution already tested by others. The CBUC (Consortium of University Catalan Libraries) runs TDX, an ETD repository now holding more than 10,000 full-text theses from the 12 university members. After 10 years running TDX, a solid preservation system was needed to ensure that every thesis would be kept as it was, regardless of what happens to the repository. The solution was found in the MetaArchive cooperative: an effort by many institutions to keep a copy of each other's content through a network, using the LOCKSS software as a mechanism to keep track of any change. The presentation will briefly introduce what TDX and MetaArchive are, but will show, in a practical way, how the LOCKSS network for preservation works. Finally, a summary of the benefits of the overall experience will be presented.