945 results for digital forensic tool testing


Relevance:

30.00%

Abstract:

Hazardous materials are substances that, if not regulated, can pose a threat to human populations and to environmental health, safety or property when transported in commerce. About 1.5 million tons of hazardous material shipments are transported by truck in the US annually, with a steady increase of approximately 5% per year. The objective of this study was to develop a routing tool for hazardous material transport that facilitates reduced environmental impacts and fewer transportation difficulties, while still finding paths that remain attractive to the shipping carriers in terms of trucking cost. The study started with the identification of inhalation hazard impact zones and explosion protective areas around the locations of hypothetical hazardous material releases, considering different parameters (i.e., chemical characteristics, release quantities, atmospheric conditions, etc.). Results showed that depending on the quantity released, the chemical, and atmospheric stability (a function of wind speed, meteorology, sky cover, time and location of the accident, etc.), the consequences of these incidents can differ. The study was extended by selecting additional evaluation criteria for further investigation, because health risk would not be the only concern in the selection of routes. Transportation difficulties (i.e., road blockage and congestion) were incorporated as an important factor due to their indirect impact/cost on the users of transportation networks. Trucking costs were also considered one of the primary criteria in selecting hazardous material paths; otherwise the suggested routes would not have been convincing for the shipping companies. A final criterion was the proximity of public places to the routes. The approach evolved from a simple framework into a sophisticated and efficient GIS-based tool able to investigate the transportation network of any given study area and capable of generating the best routing options for cargos. The suggested tool uses a multi-criteria decision-making method that considers the priorities of the decision makers in choosing cargo routes. Comparison of the routing options based on each criterion, and on the overall suitability of each path across all criteria (using the multi-criteria decision-making method), showed that tools similar to the one proposed by this study can provide decision makers with insights in the area of hazardous material transport. The tool presents the probable consequences of choosing each path in an easily understandable way, in the form of maps and tables, which makes the trade-off between costs and risks considerably simpler: in some cases, slightly compromising on trucking cost may drastically decrease the probable health risk and/or traffic difficulties. This will not only reward the community by making cities safer places to live, but can also benefit shipping companies by allowing them to advertise as environmentally friendly carriers.
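
The multi-criteria route selection described above can be illustrated with a minimal sketch: assuming each road link carries normalized per-criterion scores (health risk, traffic difficulty, trucking cost, proximity to public places) and decision-maker weights, a weighted composite cost can drive a standard shortest-path search. The graph, criterion names and weights below are hypothetical, not the thesis implementation.

```python
# Minimal sketch of weighted multi-criteria routing; the graph, criterion
# scores and decision-maker weights are hypothetical illustrations.
import networkx as nx

# Decision-maker priorities (in the study these come from the MCDM step).
weights = {"health_risk": 0.4, "traffic": 0.2, "cost": 0.3, "proximity": 0.1}

G = nx.DiGraph()
# Each edge stores normalized scores in [0, 1] for every criterion.
G.add_edge("origin", "A", health_risk=0.8, traffic=0.2, cost=0.3, proximity=0.6)
G.add_edge("origin", "B", health_risk=0.2, traffic=0.5, cost=0.4, proximity=0.3)
G.add_edge("A", "destination", health_risk=0.7, traffic=0.1, cost=0.2, proximity=0.5)
G.add_edge("B", "destination", health_risk=0.1, traffic=0.6, cost=0.5, proximity=0.2)

def composite_cost(u, v, data):
    """Weighted sum of the normalized criterion scores on one road link."""
    return sum(w * data[c] for c, w in weights.items())

route = nx.shortest_path(G, "origin", "destination", weight=composite_cost)
print(route)  # ['origin', 'B', 'destination'] under these example weights
```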

Relevance:

30.00%

Abstract:

The elemental analysis of soil is useful in forensic and environmental sciences. Methods were developed and optimized for two laser-based multi-element analysis techniques: laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) and laser-induced breakdown spectroscopy (LIBS). This work represents the first use of a 266 nm laser for forensic soil analysis by LIBS. Sample preparation methods were developed and optimized for a variety of sample types, including pellets for large bulk soil specimens (470 mg) and sediment-laden filters (47 mg), and tape-mounting for small transfer evidence specimens (10 mg). Analytical performance for sediment filter pellets and tape-mounted soils was similar to that achieved with bulk pellets. An inter-laboratory comparison exercise was designed to evaluate the performance of the LA-ICP-MS and LIBS methods, as well as of micro X-ray fluorescence (µXRF), across multiple laboratories. Limits of detection (LODs) were 0.01-23 ppm for LA-ICP-MS, 0.25-574 ppm for LIBS, and 16-4400 ppm for µXRF, all well below the levels normally seen in soils. Good intra-laboratory precision (≤ 6 % relative standard deviation (RSD) for LA-ICP-MS; ≤ 8 % for µXRF; ≤ 17 % for LIBS) and inter-laboratory precision (≤ 19 % for LA-ICP-MS; ≤ 25 % for µXRF) were achieved for most elements, which is encouraging for a first inter-laboratory exercise. While LIBS generally has higher LODs and RSDs than LA-ICP-MS, both were capable of generating good-quality multi-element data sufficient for discrimination purposes. Multivariate methods using principal components analysis (PCA) and linear discriminant analysis (LDA) were developed for the discrimination of soils from different sources. Specimens from different sites that were indistinguishable by color alone were discriminated by elemental analysis. Correct classification rates of 94.5 % or better were achieved in a simulated forensic discrimination of three similar sites for both LIBS and LA-ICP-MS. Results for tape-mounted specimens were nearly identical to those achieved with pellets. Methods were tested on soils from the USA, Canada and Tanzania. Within-site heterogeneity was site-specific. Elemental differences were greatest for specimens separated by large distances, even within the same lithology. Elemental profiles can be used to discriminate soils from different locations and to narrow down locations even when mineralogy is similar.
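
As a minimal illustration of the PCA/LDA discrimination workflow mentioned above, the sketch below builds a scikit-learn pipeline and estimates a correct classification rate by cross-validation; the element matrix, site labels and offsets are hypothetical placeholders, not the study's data.

```python
# Minimal sketch of PCA followed by LDA for soil-source discrimination,
# assuming rows are specimens and columns are element concentrations
# (all data below are hypothetical placeholders, not the study's data).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 12))                       # 60 specimens x 12 elements
y = np.repeat(["site_A", "site_B", "site_C"], 20)   # three similar sites
X[y == "site_B"] += 0.8                             # small elemental offset per site
X[y == "site_C"] -= 0.8

model = make_pipeline(StandardScaler(), PCA(n_components=5),
                      LinearDiscriminantAnalysis())
scores = cross_val_score(model, X, y, cv=5)
print(f"mean correct classification rate: {scores.mean():.2%}")
```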

Relevance:

30.00%

Abstract:

Ensemble stream modeling and data cleaning are sensor information processing systems with different training and testing methods, by which their goals are cross-validated. This research examines a mechanism that seeks to extract novel patterns by generating ensembles from data. The main goal of label-less stream processing is to process the sensed events so as to eliminate uncorrelated noise and choose the most likely model without overfitting, thus obtaining higher model confidence. Higher-quality streams can be realized by combining many short streams into an ensemble that has the desired quality. The framework for the investigation is an existing data mining tool. First, to accommodate feature extraction for events such as bush or natural forest fires, we take the burnt area (BA*), a sensed ground truth obtained from logs, as our target variable. Even though this is an obvious model choice, the results are disappointing, for two reasons: first, the histogram of fire activity is highly skewed; second, the measured sensor parameters are highly correlated. Since using non-descriptive features does not yield good results, we resort to temporal features. By doing so we carefully eliminate the averaging effects; the resulting histogram is more satisfactory and conceptual knowledge is learned from the sensor streams. Second is the process of feature induction by cross-validating attributes with single or multi-target variables to minimize training error. We use the F-measure, which combines precision and recall, to determine the false alarm rate of fire events. The multi-target data-cleaning trees use the information purity of the target leaf nodes to learn higher-order features. A sensitive variance measure, such as the F-test, is applied at each node split to select the best attribute. The ensemble stream model approach proved to improve when complex features were used with a simpler tree classifier. The ensemble framework for data cleaning and the enhancements to quantify the quality of fitness (30% spatial, 10% temporal, and 90% mobility reduction) of sensors led to the formation of quality streams for sensor-enabled applications, which further motivates the novelty of stream quality labeling and its importance in handling the vast amount of real-time mobile streams generated today.
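
For reference, the F-measure named above is the harmonic mean of precision and recall; the short sketch below scores a set of hypothetical fire-event detections against ground-truth labels.

```python
# Small sketch of scoring fire-event detections with precision, recall
# and the F-measure (harmonic mean); the labels below are hypothetical.
from sklearn.metrics import precision_score, recall_score, f1_score

y_true = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]   # ground-truth fire events from logs
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 0, 1]   # detections from the stream model

p = precision_score(y_true, y_pred)
r = recall_score(y_true, y_pred)
f = f1_score(y_true, y_pred)
print(f"precision={p:.2f} recall={r:.2f} F-measure={f:.2f}")
assert abs(f - 2 * p * r / (p + r)) < 1e-12  # harmonic-mean definition
```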

Relevance:

30.00%

Abstract:

The purpose of this thesis was to examine the mediating effects of job-related negative emotions on the relationship between workplace aggression and outcomes. Additionally, the moderating effects of workplace social support and of the intensity of workplace aggression were considered. A total of 321 working individuals participated through an online survey. The results of this thesis suggest that job-related negative emotions mediate the relationship between workplace aggression and outcomes, with both full and partial mediation supported. Workplace social support was found to buffer the relationship between workplace aggression and outcomes, regardless of the source of aggression (supervisor or co-worker) or the source of the social support. Finally, intensity of aggression was found to be a strong moderator of the relationship between workplace aggression and outcomes.

Relevance:

30.00%

Abstract:

Technology has an important role in children's lives and education. Based on several projects developed with ICT, both in Early Childhood Education (3-6 years old) and Primary Education (6-10 years old), since 1997, the authors argue that research and educational practices need to "go outside", addressing ways to connect technology with outdoor education. The experience with the projects and initiatives developed supported a conceptual framework, developed and discussed with several partners throughout the years and theoretically informed. Three main principles, or axes, have emerged: strengthening Children's Participation, promoting Critical Citizenship, and establishing strong Connections to Pedagogy and Curriculum. In this paper, these axes are presented and discussed in relation to the challenge posed by Outdoor Education to the way ICT in Early Childhood and Primary Education is understood, promoted and researched. The paper is exploratory, attempting to connect theoretical and conceptual contributions from Early Childhood Pedagogy with contributions from ICT in Education. The research-based knowledge available is still scarce, mostly based on studies developed with other purposes. The paper therefore focuses on the connections and interpellations between the concepts established through the theoretical framework and draws on almost 20 years of experience with large- and small-scale action-research projects on ICT in schools. The most recent of these is already testing the conceptual framework by supporting children in non-formal contexts to explore vineyards and the cycle of wine production with several ICT tools. Approaching Outdoor Education as an arena where pedagogical and cultural dimensions influence decisions and practices, the paper argues that the three axes are relevant in supporting a stronger connection between technology and the outdoors.

Relevance:

30.00%

Abstract:

This study examines the role of visual literacy in learning biology. Biology teachers promote the use of digital images as a learning tool for two reasons: because biology is the most visual of the sciences and the use of imagery is becoming increasingly important with the advent of bioinformatics, and because studies indicate that the current generation of teenagers has a cognitive structure formed through exposure to digital media. On the other hand, there is concern that students are not being sufficiently exposed to traditional methods of processing biological information, thought to encourage left-brain sequential thinking patterns. Theories of Embodied Cognition point to the importance of hand-drawing for proper assimilation of knowledge, and theories of Multiple Intelligences suggest that some students may learn more easily using traditional pedagogical tools. To test the claim that digital learning tools enhance the acquisition of visual literacy in this generation of biology students, a learning intervention was carried out with 33 students enrolled in an introductory college biology course. The study compared learning outcomes following the use of two types of learning tools: a traditional drawing activity and an interactive digital activity carried out on a computer. The sample was divided into two random groups, and a crossover design was implemented with two separate interventions. In the first intervention, students learned how to draw and label a cell: Group 1 learned the material by computer and Group 2 learned the material by hand-drawing. In the second intervention, students learned how to draw the phases of mitosis, and the two groups were inverted. After each learning activity, students were given a quiz on the material they had learned. Students were also asked to self-evaluate their performance on each quiz, in an attempt to measure their level of metacognition. At the end of the study, they were asked to fill out a questionnaire used to measure the level of task engagement they felt towards the two types of learning activities. Following the first testing phase, the students who learned the material by drawing had a significantly higher average grade on the associated quiz than those who learned the material by computer. The difference was lost in the second "cross-over" trial. For neither group was there a correlation between the grade the students thought they had earned through self-evaluation and the grade they actually received. In terms of the different measures of task engagement, there were no significant differences between the two groups. One finding from the study was a positive correlation between grade and self-reported time spent playing video games, and a negative correlation between grade and self-reported interest in drawing. This study provides little evidence to support claims that the use of digital tools enhances learning, but does provide evidence that drawing by hand is beneficial for learning biological images. However, the small sample size, the limited number and type of learning tasks, and the indirect means of measuring levels of metacognition and task engagement restrict generalisation of these conclusions. Nevertheless, this study indicates that teachers should not use digital learning tools to the exclusion of traditional drawing activities; further studies on the effectiveness of these tools are warranted.
Students in this study commented that the computer tool seemed more accurate and detailed, even though the two learning tools carried identical information. There was thus a mismatch between the perception of the usefulness of computers as a learning tool and the reality, which again points to the need for an objective assessment of their usefulness. Students should be given the opportunity to try out a variety of traditional and digital learning tools in order to address their different learning preferences.
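
As a rough illustration of the kinds of comparisons reported above, the sketch below runs a between-group test on quiz scores and a grade/self-report correlation; all values are hypothetical, not the study's data.

```python
# Sketch of a between-group quiz comparison and a grade vs. self-report
# correlation; all scores and hours below are hypothetical, not study data.
import numpy as np
from scipy import stats

drawing_scores = np.array([78, 85, 90, 72, 88, 81, 95, 70])   # hand-drawing group
computer_scores = np.array([65, 74, 80, 70, 68, 77, 72, 75])  # computer group
t_stat, p_val = stats.ttest_ind(drawing_scores, computer_scores)
print(f"group comparison: t = {t_stat:.2f}, p = {p_val:.3f}")

# Correlation between quiz grade and self-reported hours of video games.
grades = np.concatenate([drawing_scores, computer_scores])
game_hours = np.array([2, 5, 8, 1, 6, 4, 10, 0, 1, 3, 7, 2, 1, 5, 3, 4])
r, p_r = stats.pearsonr(grades, game_hours)
print(f"grade vs. gaming hours: r = {r:.2f}, p = {p_r:.3f}")
```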

Relevance:

30.00%

Abstract:

Liquid chromatography coupled with mass spectrometry is one of the most powerful tools in the toxicologist's arsenal for detecting a wide variety of compounds in many different matrices. However, the huge number of potentially abused substances and of new substances specifically designed as intoxicants poses a problem in a forensic toxicology setting. Most methods are targeted and designed to cover a very specific drug or group of drugs, while many other substances remain undetected. High-resolution mass spectrometry, more specifically time-of-flight mass spectrometry, represents an extremely powerful tool for analysing a multitude of compounds not only simultaneously but also retroactively. The data obtained from the time-of-flight instrument contain all compounds made available by sample extraction and chromatography, and can be processed at a later time with an improved library to detect previously unrecognised compounds without having to analyse the respective sample again. The aim of this project was to determine the utility and limitations of time-of-flight mass spectrometry as a general and easily expandable screening method. The resolution of time-of-flight mass spectrometry allows the separation of compounds with the same nominal mass but distinct exact masses without the need to separate them chromatographically. To simulate the wide variety of drugs potentially encountered in such a general screening method, seven drugs (morphine, cocaine, zolpidem, diazepam, amphetamine, MDEA and THC) were chosen to represent this variety in terms of mass, properties and functional groups. Consequently, several liquid-liquid and solid-phase extractions were applied to urine samples to determine the most generally suitable, least specific extraction. Chromatography was optimised by investigating the pH, concentration, organic solvent and gradient of the mobile phase to improve the data obtained by the time-of-flight instrument. The resulting method was validated as a qualitative confirmation/identification method. Data processing was automated using the software TargetAnalysis, which provides excellent analyte recognition according to retention time, exact mass and isotope pattern. The recognition of isotope patterns allows analytes to be identified even in interference-rich mass spectra and proved to be a good positive indicator. Finally, the validated method was applied to samples received from the A&E Department of Glasgow Royal Infirmary in suspected drug abuse cases and to samples from the Scottish Prison Service's own prevalence study targeting drugs of abuse in the prison population. The data obtained were processed with a library established in the course of this work.
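
The retrospective library matching described above can be sketched roughly as follows: a measured accurate mass and retention time are compared against library entries within a ppm mass tolerance and a retention-time window. The library entries, tolerances and the example peak are illustrative only, not the validated method's settings.

```python
# Sketch of retrospective screening: match measured accurate masses and
# retention times against a compound library within ppm and RT tolerances.
# Library entries, tolerances and the "measured" peak are illustrative only.
LIBRARY = [
    {"name": "morphine", "mz": 286.1438, "rt": 2.1},   # [M+H]+ (illustrative)
    {"name": "cocaine",  "mz": 304.1543, "rt": 6.4},   # [M+H]+ (illustrative)
]

def ppm_error(measured, theoretical):
    return (measured - theoretical) / theoretical * 1e6

def match_peak(mz, rt, ppm_tol=5.0, rt_tol=0.3):
    """Return library hits within the mass (ppm) and retention-time windows."""
    return [e["name"] for e in LIBRARY
            if abs(ppm_error(mz, e["mz"])) <= ppm_tol and abs(rt - e["rt"]) <= rt_tol]

print(match_peak(mz=304.1549, rt=6.35))   # -> ['cocaine']
```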

Relevance:

30.00%

Abstract:

Objectives: to evaluate the cognitive learning of nursing students in neonatal clinical evaluation through a blended course using computer and laboratory simulation; to compare the cognitive learning of students in a control and an experimental group testing the laboratory simulation; and to assess, according to the students, the extracurricular blended course offered on the clinical assessment of preterm infants. Method: a quasi-experimental study with 14 Portuguese students, comprising a pretest, a midterm test and a post-test. The technologies offered in the course were the serious game e-Baby, instructional software on semiology and semiotechnique, and laboratory simulation. Data collection tools developed for this study were used for the course evaluation and the characterization of the students. Nonparametric statistics were used: Mann-Whitney and Wilcoxon. Results: the use of validated digital technologies and laboratory simulation demonstrated a statistically significant difference (p = 0.001) in the learning of the participants. The students evaluated the course as very satisfactory. The laboratory simulation alone did not yield a significant difference in learning. Conclusions: the cognitive learning of participants increased significantly. The use of technology may be partly responsible for the success of the course, showing it to be an important teaching tool for innovation and for motivating learning in healthcare.
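
As a minimal sketch of the nonparametric tests named above, the snippet below applies a paired Wilcoxon test to pretest/post-test scores and a Mann-Whitney U test to control vs. experimental scores; all values are hypothetical, not the study's data.

```python
# Sketch of the nonparametric tests used in the study, on hypothetical scores.
from scipy import stats

# Paired pretest vs. post-test scores for the same participants (Wilcoxon).
pretest  = [10, 12, 9, 14, 11, 13, 10, 12, 9, 11, 13, 10, 12, 11]
posttest = [15, 16, 13, 18, 14, 17, 15, 16, 12, 15, 18, 14, 16, 15]
w_stat, w_p = stats.wilcoxon(pretest, posttest)

# Post-test scores of control vs. experimental groups (Mann-Whitney U).
control      = [13, 14, 12, 15, 13, 14, 12]
experimental = [16, 18, 15, 17, 16, 19, 17]
u_stat, u_p = stats.mannwhitneyu(control, experimental)

print(f"Wilcoxon p = {w_p:.3f}; Mann-Whitney p = {u_p:.3f}")
```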

Relevance:

30.00%

Abstract:

This report presents the experience of the project "The School Pedro II in the Professional Decision of Secondary Education Students", which aimed to support students' career choices by preparing them for the transition from secondary education to higher education, technical training or the job market, integrating the areas of knowledge with ICT. The starting questions were: How can a vocation for academic life be awakened in students? How can a connection be established between what students want to be in the future and their choices when the path is not a university course? How can the factors that influence students' career decisions be taken into account so that they build their own knowledge about the chosen profession? The experiment was carried out at the State School of Elementary and Secondary Education D. Pedro II (Belém, Pará State, Brazil), based on the view that knowledge must be represented in a format that requires coordination among the different forms of knowledge and the organization and use of technology. The results show that the tasks performed by students for their professional choice provided information about themselves and about the professional world. The concept map contributed as a mediating tool for teaching, learning and assessment, and fostered interest, autonomy and participation.

Relevance:

30.00%

Abstract:

The issues influencing student engagement with high-stakes computer-based exams were investigated, drawing on feedback from two cohorts of international MA Education students encountering this assessment method for the first time. Qualitative data from surveys and focus groups on the students' examination experience were analysed, leading to the identification of engagement issues in the delivery of high-stakes computer-based assessments. The exam combined short-answer open-response questions with multiple-choice-style items to assess knowledge and understanding of research methods. The findings suggest that engagement with computer-based testing depends, to a lesser extent, on students' general levels of digital literacy and, to a greater extent, on their information technology (IT) proficiency for assessment and their ability to adapt their test-taking strategies, including organisational and cognitive strategies, to the online assessment environment. The socialisation and preparation of students for computer-based testing therefore emerge as key responsibilities for instructors to address, with students requesting increased opportunities for practice and training to develop the IT skills and test-taking strategies necessary to succeed in computer-based examinations. These findings and their implications in terms of instructional responsibilities form the basis of a proposed framework for Learner Engagement with e-Assessment Practices.

Relevance:

30.00%

Abstract:

The research presented herein aims to investigate the strengths and weaknesses of a relatively new technique called phytoscreening. Parallel to the well-known phytoremediation, it consists of exploiting the absorbing potential of trees to delineate groundwater contamination plumes, especially for chlorinated ethenes (i.e., PCE, TCE, cis-1,2-DCE, and VC). The latter are prevalent contaminants in groundwater, but their fate and transport in surface ecosystems, such as trees, are still poorly understood and subject to high variability. Moreover, the analytical validity of tree-coring is still limited in many countries due to a lack of knowledge of its application opportunities. Tree-cores are extracted from trunks and generally analyzed by gas chromatography/mass spectrometry. A systematic review of the previous literature on phytoscreening for chlorinated ethenes is presented in this PhD thesis to evaluate the factors influencing the effectiveness of the technique. In addition, we tested the technique by probing eight sites contaminated by chlorinated ethenes in Italy (Emilia-Romagna) under different hydrogeological and seasonal settings. We coupled the technique with the assessment of gaseous-phase concentrations directly on site, inserting detector tubes or a photoionization detector into the tree-holes left by the coring tool. Finally, we applied rank-order statistical analysis to the field data, along with literature data, to assess under which conditions phytoscreening should be applied to either screen or monitor environmental contamination issues. A relatively high correlation exists between tree-core and groundwater concentrations (Spearman's ρ > 0.6), being higher for compounds with higher sorption, for sites with shallower and thinner aquifers, and when sampling specific tree types with standardized sampling and extraction protocols. These results indicate the opportunities for assessing the occurrence, type, and concentration of solvents directly from the stems of trees. This can reduce the costs of characterization surveys, allowing rapid identification of hotspots and of plume direction and thus optimizing the drilling of boreholes.
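
The tree-core vs. groundwater relationship reported above is a rank-order correlation; a minimal sketch with hypothetical paired concentrations is shown below.

```python
# Sketch of Spearman rank-order correlation between tree-core and groundwater
# concentrations (e.g., PCE); the paired values below are hypothetical.
from scipy import stats

tree_core_ug_kg  = [0.5, 2.1, 0.9, 4.8, 3.2, 0.2, 1.5, 6.0]   # per sampled tree
groundwater_ug_l = [12, 150, 30, 210, 140, 5, 70, 260]         # nearest monitoring well

rho, p_value = stats.spearmanr(tree_core_ug_kg, groundwater_ug_l)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```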

Relevance:

30.00%

Abstract:

Knowledge graphs and ontologies are closely related concepts in the field of knowledge representation. In recent years, knowledge graphs have gained increasing popularity and serve as essential components in many knowledge engineering projects that view them as crucial to their success. The conceptual foundation of the knowledge graph is provided by ontologies. Ontology modeling is an iterative engineering process that consists of steps such as the elicitation and formalization of requirements and the development, testing, refactoring, and release of the ontology. Testing the ontology is a crucial and occasionally overlooked step of the process, due to the lack of integrated tools to support it. As a result of this gap in the state of the art, ontology testing is carried out manually, which requires a considerable amount of time and effort from the ontology engineers. The lack of tool support is also noticeable in the requirement elicitation process. In this respect, the rise in the adoption and accessibility of knowledge graphs allows for the development and use of automated tools to assist with the elicitation of requirements from such a complementary source of data. Therefore, this doctoral research is focused on developing methods and tools that support the requirement elicitation and testing steps of an ontology engineering process. To support ontology testing, we have developed XDTesting, a web application integrated with the GitHub platform that serves as an ontology testing manager. Concurrently, to support the elicitation and documentation of competency questions, we have defined and implemented RevOnt, a method to extract competency questions from knowledge graphs. Both methods are evaluated through their implementation, and the results are promising.
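
This is not the XDTesting or RevOnt implementation, but a minimal illustration of what an automated ontology check can look like: loading an ontology with rdflib and verifying that a class required by a competency question is declared. The file name, namespace and class are hypothetical.

```python
# Minimal illustration of an automated ontology test (not XDTesting/RevOnt):
# load an ontology and check that a class needed by a competency question
# is declared. File name, namespace and class below are hypothetical.
from rdflib import Graph, URIRef
from rdflib.namespace import RDF, OWL

g = Graph()
g.parse("example_ontology.ttl", format="turtle")   # hypothetical ontology file

required_class = URIRef("http://example.org/onto#Event")  # from a competency question

def class_is_declared(graph, cls):
    """True if the class is declared as an owl:Class in the ontology."""
    return (cls, RDF.type, OWL.Class) in graph

assert class_is_declared(g, required_class), f"Missing class: {required_class}"
print("Competency-question class check passed.")
```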

Relevance:

30.00%

Abstract:

Amid the remarkable growth of innovative technologies, particularly immersive technologies such as Extended Reality (XR), comprising Virtual Reality (VR), Augmented Reality (AR) and Mixed Reality (MR), a transformation is unfolding in the way we collaborate and interact. The current research takes the initiative to explore XR's potential for co-creation activities and proposes XR as a future co-creation platform. It strives to develop an XR-based co-creation system and to actively engage stakeholders in the co-creation process, with the goal of enhancing their creative businesses. The research leverages XR tools to investigate how they can enhance digital co-creation methods and to determine whether the system facilitates efficient and effective value creation during XR-based co-creation sessions. In specific terms, the research probes whether the XR-based co-creation method and environment enhance the quality and novelty of ideas, reduce communication challenges by providing a better understanding of the product, problem or process, and optimize the process in terms of reduced time and costs. The research introduces a multi-user, multi-sensory, collaborative and interactive XR platform that adapts to various use-case scenarios. This thesis also presents the user testing performed to collect both qualitative and quantitative data, which serves to substantiate the hypothesis. What sets this XR system apart is its incorporation of fully functional prototypes into a mixed reality environment, providing users with a unique dimension within an immersive digital landscape. The outcomes derived from the experimental studies demonstrate that XR-based co-creation surpasses conventional desktop co-creation methods and that, remarkably, the results are even comparable to a full mock-up test. In conclusion, the research underscores that the utilization of XR as a tool for co-creation generates substantial value: it serves as a method that enhances the process, an environment that fosters interaction and collaboration, and a platform that equips stakeholders with the means to engage effectively.

Relevance:

30.00%

Abstract:

Linear cascade testing serves a fundamental role in the research, development, and design of turbomachines, as it is a simple yet very effective way to evaluate the performance of a generic blade geometry. These kinds of experiments are usually carried out in specialized wind tunnel facilities. This thesis deals with the numerical characterization and subsequent partial redesign of the S-1/C Continuous High Speed Wind Tunnel of the Von Karman Institute for Fluid Dynamics. The current facility is powered by a 13-stage axial compressor that is not powerful enough to balance the energy loss experienced when testing low-turning airfoils. In order to address this issue, a performance assessment of the wind tunnel was carried out under several flow regimes via numerical simulations. After that, a redesign proposal aimed at reducing the pressure loss was investigated. It consists of a linear cascade of turning blades to be placed downstream of the test section and designed specifically for the type of linear cascade being tested. An automatic design procedure was created, taking as input the parameters measured at the outlet of the cascade. The parametrization method employed Bézier curves to produce an airfoil geometry that could be imported into CAD software so that a cascade could be designed. The proposal was simulated via CFD analysis and proved effective in reducing pressure losses by up to 41%. The tool developed in this thesis could be adopted to design similar apparatuses and could also be optimized and specialized for the design of turbomachine components.
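
As a minimal sketch of the Bézier parametrization mentioned above, the snippet below evaluates a Bézier curve from a set of control points using the Bernstein polynomial form; the control points are hypothetical, not the thesis geometry.

```python
# Sketch of evaluating a Bézier curve from control points via the
# Bernstein polynomial form, as used for airfoil parametrization.
# The control points below are hypothetical, not the thesis geometry.
import numpy as np
from math import comb

def bezier(control_points, n_samples=100):
    """Evaluate a Bézier curve defined by (x, y) control points."""
    P = np.asarray(control_points, dtype=float)
    n = len(P) - 1                                   # curve degree
    t = np.linspace(0.0, 1.0, n_samples)[:, None]
    curve = np.zeros((n_samples, 2))
    for i, point in enumerate(P):
        bernstein = comb(n, i) * t**i * (1 - t)**(n - i)
        curve += bernstein * point
    return curve

# Hypothetical control polygon for one side of a cambered profile.
suction_side = [(0.0, 0.0), (0.2, 0.15), (0.6, 0.12), (1.0, 0.0)]
xy = bezier(suction_side)
print(xy[:3])
```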

Relevance:

30.00%

Abstract:

In recent years, energy modernization has focused on smart engineering advancements. This entails designing complex software and hardware for variable-voltage digital substations. A digital substation consists of electrical and auxiliary devices, control and monitoring devices, computers, and control software. Intelligent measurement systems in digital substations use digital instrument transformers and IEC 61850-compliant information exchange protocols. Digital instrument transformers used for real-time high-voltage measurements should combine advanced digital, measuring, information, and communication technologies, and should be cheap, small, light, and fire- and explosion-safe. These smaller and lighter transformers allow long-distance transmission of an optical signal that gauges direct or alternating current. The high cost of optical converters, however, remains a problem. To improve measurement accuracy, amorphous alloys are used in the magnetic circuits together with compensating feedback. Large-scale voltage converters can be made cheaper by using resistive, capacitive, or hybrid voltage dividers. In known electronic voltage transformers, the voltage divider output is generally on the low-voltage side, which simplifies the organization of the power supply. Combining current and voltage transformers reduces equipment size, installation, and maintenance costs: the two devices cost less together than individually. To increase the accuracy of commercial power metering, current and voltage converters should be incorporated into digital instrument transformers so that simultaneous analogue-to-digital samples are obtained. Multichannel ADC microcircuits with a synchronous conversion start allow samples to be drawn in parallel in a natural way. Digital instrument transformers are designed to adapt to substation operating conditions and environmental variables, especially ambient temperature. An embedded microprocessor performs automatic diagnostics and calibration of the proposed digital instrument transformer.
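
As a loose illustration of the voltage-divider measurement chain described above, the sketch below converts a synchronously sampled ADC code on the low-voltage side back to the primary voltage through the divider ratio; all component and ADC values are hypothetical.

```python
# Sketch of scaling a synchronous ADC sample back to the primary voltage
# through a resistive divider; all component and ADC values are hypothetical.
R_HIGH = 9_990_000.0     # high-voltage arm resistance, ohms
R_LOW  = 10_000.0        # low-voltage arm resistance, ohms
DIVIDER_RATIO = R_LOW / (R_HIGH + R_LOW)          # V_out = V_in * ratio

ADC_BITS = 16
ADC_VREF = 5.0           # volts full scale

def adc_code_to_primary_voltage(code):
    """Convert a raw ADC code to the estimated primary-side voltage."""
    v_low = code / (2**ADC_BITS - 1) * ADC_VREF   # voltage at the divider tap
    return v_low / DIVIDER_RATIO                  # undo the divider attenuation

# Simultaneous samples from current and voltage channels would be converted
# the same way, each channel with its own scale factor.
print(f"{adc_code_to_primary_voltage(40_000):.1f} V")
```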