752 results for Engineering -- Data processing -- Study and teaching
Abstract:
The issue of institutional engineering has gained renewed interest with the democratic transitions of the Central and Eastern European countries, since for some states it became a matter of state survival. The four countries examined in the study – Macedonia, Slovakia, Romania and Bulgaria – exemplify the difficulty of establishing a stable democratic society amid a resurgence of national identity. The success of ethnonational minorities in securing policies that affirmed or expanded their rights as a group was conditioned upon the cohesion of the minority as well as the permissiveness of state institutions toward the participation and representation of minority members. The Hungarian minorities in Slovakia and Romania, the Turkish minority in Bulgaria, and the Albanian minority in Macedonia formed political organizations to represent their interests. However, in some cases the divergence of strategies or goals between factions of the minority group seriously impeded its ability to obtain the desired concessions from the majority. The difficulty in pursuing policies favoring the expansion of minority rights was further exacerbated in some of the cases by the impermissiveness of political institutions. The political parties representing the interests of ethnonational minorities were allowed to participate in elections, although not without suspicion about their intent and even strong opposition from majority groups; but participation in elections and subsequent representation in legislative bodies did not translate into adoption of the desired policies. The ethnonational minorities' inability to effectively influence the decision-making process was the result of the inadequacy of democratic institutions to process these demands and channel them through the normal political process in the absence of a majority desire to accommodate them. Despite the promise of democratic institutions to bring about a major overhaul of the policies of forceful assimilation and disregard for minority rights, the four cases analyzed in the study demonstrate that, in effect, ethnonational minorities continued to be at the mercy of the majority, especially if the minority was unable to position itself as a balancing actor.
Abstract:
With the advent of peer-to-peer networks, and more importantly sensor networks, the desire to extract useful information from continuous and unbounded streams of data has become more prominent. For example, in tele-health applications, sensor-based data streaming systems are used to continuously and accurately monitor Alzheimer's patients and their surrounding environment. Typically, the requirements of such applications necessitate the cleaning and filtering of continuous, corrupted, and incomplete data streams gathered wirelessly under dynamically varying conditions. Yet existing data stream cleaning and filtering schemes are incapable of capturing the dynamics of the environment while simultaneously suppressing the losses and corruption introduced by uncertain environmental, hardware, and network conditions. Consequently, existing data cleaning and filtering paradigms are being challenged. This dissertation develops novel schemes for cleaning data streams received from a wireless sensor network operating under non-linear and dynamically varying conditions. The study establishes a paradigm for validating spatio-temporal associations among data sources to enhance data cleaning. To reduce the complexity of the validation process, the developed solution maps the application requirements onto a geometric space and identifies the potential sensor nodes of interest. Additionally, this dissertation models a wireless sensor network data reduction system, showing that separating the data adaptation and prediction processes increases data reduction rates. The schemes presented in this study are evaluated using simulation and information theory concepts. The results demonstrate that dynamic conditions of the environment are better managed when validation is used for data cleaning. They also show that data reduction rates improve significantly when a fast-converging adaptation process is deployed. Targeted applications of the developed methodology include machine health monitoring, tele-health, environment and habitat monitoring, intermodal transportation, and homeland security.
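As a rough illustration of the kind of spatio-temporal validation this abstract describes, the Python sketch below cleans a reading by checking it against spatially associated nodes and the sensor's own recent history. The function name, thresholds, and imputation rule are illustrative assumptions, not the dissertation's actual algorithm.

```python
# Hypothetical sketch of spatio-temporal validation for cleaning a
# sensor stream; names, thresholds, and the imputation rule are
# illustrative, not the dissertation's actual scheme.
import numpy as np

def clean_reading(value, neighbor_values, history, k=3.0):
    """Validate one reading against spatial neighbors and its own history."""
    context = np.concatenate([neighbor_values, history])
    center = np.median(context)
    # Median absolute deviation: a robust spread estimate for noisy streams.
    mad = max(np.median(np.abs(context - center)), 1e-9)
    if abs(value - center) > k * 1.4826 * mad:
        # The reading disagrees with its spatio-temporal context:
        # treat it as corrupted and impute the context median.
        return center, False
    return value, True

# Example: neighbors and history agree near 20; a 55.0 spike is replaced.
cleaned, ok = clean_reading(55.0, np.array([19.8, 20.1, 20.3]),
                            np.array([19.9, 20.0]))
```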
Abstract:
This research studies a hybrid flow shop problem with parallel batch-processing machines in one stage and discrete-processing machines in the other stages, processing jobs of arbitrary sizes. The objective is to minimize the makespan for a set of jobs. The problem is denoted FF|batch(1), s_j|C_max. It is formulated as a mixed-integer linear program, and the commercial solver AMPL/CPLEX is used to solve problem instances to optimality. Experimental results show that AMPL/CPLEX requires considerable time to find the optimal solution even for small problems: a 6-job instance requires 2 hours on average. A bottleneck-first-decomposition (BFD) heuristic is proposed in this study to overcome the computational time encountered with the commercial solver. The proposed BFD heuristic, inspired by the shifting bottleneck heuristic, decomposes the entire problem into three sub-problems and schedules them one by one. It consists of four major steps: formulating sub-problems, prioritizing sub-problems, solving sub-problems, and re-scheduling. For solving the sub-problems, two heuristic algorithms are proposed: one for scheduling a hybrid flow shop with discrete-processing machines, and the other for scheduling parallel batching machines (single stage). Both consider job arrival and delivery times. An experimental design is conducted to evaluate the effectiveness of the proposed BFD heuristic, which is further compared against a set of common heuristics including a randomized greedy heuristic and five dispatching rules. The results show that the proposed BFD heuristic outperforms all of these algorithms. To evaluate the quality of the heuristic solutions, a procedure is developed to calculate a lower bound on the makespan for the problem under study; the bound obtained is tighter than other bounds developed for related problems in the literature. A meta-search approach based on the genetic algorithm concept is developed to evaluate the significance of further improving the solution obtained from the BFD heuristic; the experiment indicates that it reduces the makespan by 1.93% on average, within negligible time, for problems with fewer than 50 jobs.
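To make the parallel-batching sub-problem concrete, here is a minimal Python sketch that schedules jobs of arbitrary sizes on parallel batch-processing machines using a generic first-fit-decreasing heuristic; this is an assumed simplification for illustration, not the proposed BFD algorithm.

```python
# Illustrative sketch of one sub-problem from the abstract: jobs of
# arbitrary sizes on parallel batch-processing machines, minimizing
# makespan. Generic first-fit-decreasing batching, not the BFD heuristic.
import heapq

def batch_and_schedule(jobs, capacity, n_machines):
    """jobs: list of (size, processing_time); returns the makespan."""
    # Form batches first-fit-decreasing by size; a batch runs as long
    # as its longest job, and its total size must fit machine capacity.
    batches = []  # each batch: [remaining_capacity, batch_time]
    for size, p in sorted(jobs, reverse=True):
        for b in batches:
            if b[0] >= size:
                b[0] -= size
                b[1] = max(b[1], p)
                break
        else:
            batches.append([capacity - size, p])
    # Assign batches (longest first) to the earliest-available machine.
    ready = [0.0] * n_machines
    heapq.heapify(ready)
    for _, t in sorted(batches, key=lambda b: -b[1]):
        heapq.heappush(ready, heapq.heappop(ready) + t)
    return max(ready)

# Four jobs, batch capacity 5, two machines -> makespan 7.0
print(batch_and_schedule([(3, 5.0), (2, 4.0), (4, 7.0), (1, 2.0)],
                         capacity=5, n_machines=2))
```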
Abstract:
Background: Biologists often need to assess whether unfamiliar datasets warrant the time investment required for more detailed exploration. Basing such assessments on the brief descriptions provided by data publishers is unwieldy for large datasets whose insights depend on specific scientific questions. Alternatively, using complex software systems for a preliminary analysis may itself be deemed too time-consuming, especially for unfamiliar data types and formats. This can lead to wasted analysis time and the discarding of potentially useful data. Results: We present an exploration of the design opportunities that the Google Maps interface offers to biomedical data visualization. In particular, we focus on synergies between visualization techniques and Google Maps that facilitate the development of biological visualizations combining low overhead with sufficient expressivity to support the exploration of data at multiple scales. The methods we explore rely on displaying pre-rendered visualizations of biological data in browsers, with sparse yet powerful interactions, using the Google Maps API. We structure our discussion around five visualizations: a gene co-regulation visualization, a heatmap viewer, a genome browser, a protein interaction network, and a planar visualization of white matter in the brain. Feedback from collaborative work with domain experts suggests that our Google Maps visualizations offer multiple, scale-dependent perspectives and can be particularly helpful for unfamiliar datasets due to their accessibility. We also find that users, particularly those less experienced with computers, are attracted by the familiarity of the Google Maps interface. Our five implementations introduce design elements that can benefit visualization developers. Conclusions: We describe a low-overhead approach that lets biologists access readily analyzed views of unfamiliar scientific datasets. We rely on pre-computed visualizations prepared by data experts, accompanied by sparse and intuitive interactions, and distributed via the familiar Google Maps framework. Our contributions are an evaluation demonstrating the validity and opportunities of this approach, a set of design guidelines for those wanting to create such visualizations, and five concrete example visualizations.
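The pre-rendered approach can be illustrated with a short Python sketch that slices a large visualization image into 256x256 zoom/x/y tiles, the layout that custom Google Maps image map types consume; the file names, paths, and zoom range are hypothetical.

```python
# A minimal sketch of the tile-based approach described above: slice a
# large pre-rendered visualization (e.g., a heatmap image) into 256x256
# tiles named by zoom/column/row, as expected by a custom image map type
# in the Google Maps API. Paths and parameters are illustrative.
import os
from PIL import Image

TILE = 256

def make_tiles(image_path, out_dir, max_zoom=3):
    base = Image.open(image_path)
    for z in range(max_zoom + 1):
        n = 2 ** z                              # n x n tiles at zoom z
        level = base.resize((TILE * n, TILE * n))
        for x in range(n):
            for y in range(n):
                tile = level.crop((x * TILE, y * TILE,
                                   (x + 1) * TILE, (y + 1) * TILE))
                path = os.path.join(out_dir, str(z), str(x))
                os.makedirs(path, exist_ok=True)
                tile.save(os.path.join(path, f"{y}.png"))

# make_tiles("heatmap.png", "tiles/")  # served statically and consumed
# by a google.maps.ImageMapType whose getTileUrl maps to tiles/{z}/{x}/{y}.png
```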
Abstract:
Background: Healthcare worldwide needs translation of basic ideas from engineering into the clinic. Consequently, there is increasing demand for graduates equipped with the knowledge and skills to apply interdisciplinary medicine/engineering approaches to the development of novel solutions for healthcare. The literature provides little guidance regarding barriers to, and facilitators of, effective interdisciplinary learning for engineering and medical students in a team-based project context. Methods: A quantitative survey was distributed to engineering and medical students and staff at two universities, one in Ireland and one in Belgium, to chart knowledge and practice in interdisciplinary learning and teaching and in the teaching of innovation. Results: We report important differences between the disciplines, for both staff and students, in attitudes towards, and perceptions of, the relevance of interdisciplinary learning opportunities and the role of creativity and innovation. There was agreement across groups concerning preferred learning and instructional styles and module content. Medical students showed greater resistance to the use of structured creativity tools and interdisciplinary teams. Conclusions: The results of this international survey will help to define the optimal learning conditions under which undergraduate engineering and medicine students can learn to consider the diverse factors that determine the success or failure of a healthcare engineering solution.
Abstract:
Much has been written about Big Data from technical, economic, juridical, and ethical perspectives. Still, very little empirical and comparative data is available on how Big Data is approached and regulated in Europe and beyond. This contribution makes a first effort to fill that gap by presenting the responses to a survey on Big Data from the Data Protection Authorities of fourteen European countries, together with comparative legal research covering eleven countries, and it addresses ten challenges for the regulation of Big Data.
Abstract:
The generation of heterogeneous big data sources with ever-increasing volumes, velocities, and veracities over the last few years has inspired the data science and research community to address the challenge of extracting knowledge from big data. Such a wealth of generated data can be intelligently exploited to advance our knowledge about our environment, public health, critical infrastructure, and security. In recent years we have developed generic approaches to process such big data at multiple levels for advancing decision support. These specifically concern data processing with semantic harmonisation, low-level fusion, analytics, and knowledge modelling with high-level fusion and reasoning. Such approaches are introduced and presented in the context of the TRIDEC project results on critical oil and gas industry drilling operations, and of the ongoing large eVacuate project on critical crowd behaviour detection in confined spaces.
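As a toy illustration of this kind of multi-level processing, the Python sketch below harmonises heterogeneous readings to common units, fuses them at a low level, and applies a high-level rule for decision support; the units, thresholds, and rules are invented for illustration and are not taken from TRIDEC or eVacuate.

```python
# Toy sketch of multi-level big-data processing: semantic harmonisation
# of units, low-level fusion by averaging, and high-level rule-based
# reasoning. All values, units, and thresholds are invented.
def harmonise(reading):
    """Map heterogeneous source formats to a common (sensor, bar) form."""
    to_bar = {"bar": 1.0, "psi": 0.0689476, "kPa": 0.01}
    return reading["sensor"], reading["value"] * to_bar[reading["unit"]]

def low_level_fuse(readings):
    """Fuse harmonised pressure readings by simple averaging."""
    values = [harmonise(r)[1] for r in readings]
    return sum(values) / len(values)

def high_level_reason(pressure_bar):
    """Rule-based decision support on the fused estimate."""
    if pressure_bar > 350.0:
        return "ALERT: abnormal drilling pressure, escalate to operator"
    return "normal"

readings = [
    {"sensor": "p1", "value": 355.0, "unit": "bar"},
    {"sensor": "p2", "value": 5200.0, "unit": "psi"},  # ~358.5 bar
]
print(high_level_reason(low_level_fuse(readings)))     # -> ALERT: ...
```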
Abstract:
This short paper presents a numerical method for the spatial and temporal downscaling of solar global radiation and mean air temperature data from global weather forecast models, together with its validation. The final objective is to develop a prediction algorithm to be integrated into energy management models and into the forecasting of energy harvesting in medium/low-temperature solar thermal systems. Hourly prediction and measurement data for solar global radiation and mean air temperature were first obtained and then numerically downscaled to half-hourly prediction values for the location where the measurements were taken. The differences between predictions and measurements were analyzed for more than one year of mean air temperature and solar global radiation data on clear-sky days, resulting in relative daily deviations of around -0.9±3.8% and 0.02±3.92%, respectively.
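A minimal sketch of the temporal part of such a downscaling step follows, assuming simple linear interpolation from hourly to half-hourly values; the paper's numerical scheme may differ, and the data here are synthetic.

```python
# Toy sketch of temporal downscaling: linear interpolation of hourly
# forecasts to a half-hourly grid, plus the relative daily deviation
# used to compare predictions against measurements. The interpolation
# method is an assumption; the paper's scheme may differ.
import numpy as np

hours = np.arange(24.0)                                  # hourly timestamps
forecast_hourly = 15 + 10 * np.sin(np.pi * hours / 24)   # toy temperature, deg C

half_hours = np.arange(0.0, 24.0, 0.5)                   # half-hourly grid
forecast_half = np.interp(half_hours, hours, forecast_hourly)

# Toy "measurements" and the daily relative deviation, as in the abstract.
measured = forecast_half + np.random.default_rng(0).normal(0, 0.3,
                                                           half_hours.size)
rel_dev = 100 * (forecast_half.sum() - measured.sum()) / measured.sum()
print(f"relative daily deviation: {rel_dev:+.2f} %")
```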
Abstract:
This paper investigates the use of iPads in the assessment of predominantly second-year Bachelor of Education (Primary/Early Childhood) pre-service teachers undertaking a physical education and health unit. Within this unit, practical assessment tasks are graded by tutors in a variety of indoor and outdoor settings. The main barriers to effective assessment in these contexts include limited time to assess and to provide explicit feedback for large numbers of students, complex assessment procedures, overwhelming record-keeping, and the need to assess students without distracting from the performance being presented. The purpose of this pilot study was to investigate whether incorporating mobile technologies such as iPads to access online rubrics within the Blackboard environment would enhance and simplify the assessment process. The findings indicate that using iPads to access online rubrics streamlined the assessment process by providing pre-service teachers with immediate and explicit feedback. In addition, tutors experienced a reduction in the time required for the same workload, as the iPad dictation function allowed quicker forms of feedback. These outcomes have future implications and potential for mobile paperless assessment in other disciplines such as health, environmental science, and engineering.
Abstract:
The project aims to develop an understanding of additive manufacturing (AM) and other Manufacturing 4.0 techniques with a view toward industrialization. First, the internal material anisotropy of elements created with the most economically feasible FDM technique was established. The main drivers of variability in AM were characterized, with the focus on achieving internal material isotropy. Subsequently, a technique for deposition parameter optimization was presented, and the procedure was further tested on other polymeric materials and composites. A replicability assessment by means of 4.0 technologies was proposed, and subsequent industry findings established the need to develop a process demonstrating how to re-engineer designs to obtain the best results with AM processing. The final study applies the Industrial Design and Structure Method (IDES), using all the knowledge previously gathered to fully re-engineer a product with a focus on tools from the 4.0 era, from product feasibility studies through CAE (FEM) analysis and CAM (DfAM). These results would help make AM and FDM processes a viable option to combine with composite technologies, achieving a reliable, cost-effective manufacturing method that could also be used for mass-market industry applications.
Abstract:
The Probe for LUminosity MEasurement (PLUME) detector is a luminometer for the LHCb experiment at CERN. It will provide instantaneous luminosity measurements for LHCb during Run 3 at the LHC. The goal of this thesis is to evaluate, with simulated data, the expected performance of PLUME, such as the occupancy of the PMTs composing the detector, and to report the analysis of the first data obtained by PLUME during a Van der Meer scan. In particular, three measurements of the cross-section value, needed to calibrate the detector, were obtained: σ1Da = (1.14 ± 0.11) mb, σ1Db = (1.13 ± 0.10) mb, σ2D = (1.20 ± 0.02) mb, where the subscripts 1D and 2D correspond to one-dimensional and two-dimensional Van der Meer scans, respectively. All results are in agreement with one another.
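For context, the visible cross-section in a Van der Meer scan is conventionally extracted from the textbook relation (standard form, not necessarily the exact expression used in this thesis):

$$
\sigma_{\mathrm{vis}} = \mu_{\mathrm{vis}}^{\max}\,\frac{2\pi\,\Sigma_x\,\Sigma_y}{n_1\,n_2}
$$

where $\mu_{\mathrm{vis}}^{\max}$ is the peak number of visible interactions per bunch crossing, $\Sigma_x$ and $\Sigma_y$ are the convolved beam widths obtained from the horizontal and vertical scan curves, and $n_1$, $n_2$ are the bunch populations.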
Abstract:
Background: The inherent complexity of statistical methods and clinical phenomena compels researchers with diverse domains of expertise to work in interdisciplinary teams, where none of them has complete knowledge of their counterparts' fields. As a result, knowledge exchange may often be characterized by miscommunication leading to misinterpretation, ultimately resulting in errors in research and even in clinical practice. Although communication has a central role in interdisciplinary collaboration and miscommunication can have a negative impact on research processes, to the best of our knowledge no study has yet explored how data analysis specialists and clinical researchers communicate over time. Methods/Principal Findings: We conducted qualitative analysis of encounters between clinical researchers and data analysis specialists (an epidemiologist, a clinical epidemiologist, and a data mining specialist). These encounters were recorded and systematically analyzed using a grounded theory methodology for the extraction of emerging themes, followed by data triangulation and analysis of negative cases for validation. A policy analysis was then performed using a system dynamics methodology, looking for potential interventions to improve this process. Four major themes emerged. Definitions using lay language were frequently employed as a way to bridge the language gap between the specialties. Thought experiments presented a series of "what if" situations that helped clarify how the method or information from the other field would behave if exposed to alternative situations, ultimately aiding in explaining their main objective. Metaphors and analogies were used to translate concepts across fields, from the unfamiliar to the familiar. Prolepsis was used to anticipate study outcomes, thus helping specialists understand the current context based on an understanding of their final goal. Conclusion/Significance: The communication between clinical researchers and data analysis specialists presents multiple challenges that can lead to errors.
Abstract:
In this study, the influence of processing conditions and of the addition of trans-polyoctenylene rubber (TOR) on the Mooney viscosity, tensile properties, hardness, tear resistance, and resilience of natural rubber/styrene-butadiene rubber blends was investigated. The results are explained in light of dynamic mechanical and morphological analyses. Increasing the processing time produced a finer blend morphology, which improved the mechanical properties. The addition of TOR increased hardness, decreased tear resistance, and had no effect on resilience. It produced a large decrease in the Mooney viscosity and a slight decrease in the tensile properties when the components of the compounds were not properly mixed. The results indicate that TOR acted more as a plasticizer than as a compatibilizer.
Abstract:
In this work, the oxidation of the model pollutant phenol was studied by means of the O3, O3-UV, and O3-H2O2 processes. Experiments were carried out in a fed-batch system to investigate the effects of the initial dissolved organic carbon (DOC) concentration, the initial ozone concentration in the gas phase, the presence or absence of UVC radiation, and the initial hydrogen peroxide concentration. The experimental results were used to model the degradation processes with neural networks, in order to simulate DOC-time profiles and evaluate the relative importance of the process variables.
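As an illustration of this modeling step, a small neural network can map the process variables to the remaining DOC at a given time. The sketch below uses scikit-learn's MLPRegressor on synthetic data; all variable ranges and the toy decay law are assumptions, not the paper's experimental data.

```python
# Illustrative sketch: a small neural network mapping process variables
# (initial DOC, inlet ozone concentration, UVC on/off, initial H2O2
# dose, time) to remaining DOC. Synthetic data stands in for the
# paper's fed-batch experiments; ranges and decay law are invented.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(50, 200, n),    # initial DOC, mg/L
    rng.uniform(20, 80, n),     # inlet ozone concentration, g/m3
    rng.integers(0, 2, n),      # UVC radiation on/off
    rng.uniform(0, 10, n),      # initial H2O2, mmol/L
    rng.uniform(0, 120, n),     # treatment time, min
])
# Toy first-order decay whose rate grows with oxidant intensity.
k = 0.002 * X[:, 1] * (1 + 0.5 * X[:, 2] + 0.05 * X[:, 3])
y = X[:, 0] * np.exp(-k * X[:, 4] / 60)

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000,
                     random_state=0)
model.fit(X, y)
# Simulate a DOC-time profile for fixed conditions, as in the abstract.
times = np.linspace(0, 120, 9)
profile = model.predict([[100, 60, 1, 5, t] for t in times])
```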