956 results for Production engineering Data processing


Relevance:

100.00%

Publisher:

Abstract:

The thesis represents the conclusive outcome of the European Joint Doctorate programme in Law, Science & Technology, funded by the European Commission through the Marie Skłodowska-Curie Innovative Training Networks actions within H2020, grant agreement n. 814177. The tension between data protection and privacy on the one side, and the need to allow further uses of processed personal data on the other, is investigated, tracing the technological development of the de-anonymization/re-identification risk through an exploratory survey. After assessing its extent, it is questioned whether a certain degree of anonymity can still be granted, focusing on a double perspective: an objective and a subjective one. The objective perspective focuses on the data processing models per se, while the subjective perspective investigates whether the distribution of roles and responsibilities among stakeholders can ensure data anonymity.

Relevance:

100.00%

Publisher:

Abstract:

Cleaning is one of the most important and delicate procedures in the restoration process. When developing new cleaning systems, it is fundamental to consider their selectivity towards the layer to be removed, their non-invasiveness towards the one to be preserved, and their sustainability and non-toxicity. Besides assessing their efficacy, it is important to understand their mechanism through analytical protocols that strike a balance between cost, practicality, and reliable interpretation of results. In this thesis, the development of cleaning systems based on the coupling of electrospun fabrics (ES) with greener organic solvents is proposed. Electrospinning is a versatile technique that allows the production of micro/nanostructured non-woven mats, which have already been used as absorbents in various scientific fields but, to date, not in restoration. The systems produced proved effective for the removal of dammar varnish from paintings, where the ES act not only as solvent-binding agents but also as adsorbents of the partially solubilised varnish through capillary rise, thus enabling a one-step procedure. They have also been successfully applied to the removal of spray varnish from marble substrates and wall paintings. Due to the complexity of the materials, the procedure had to be adapted case by case, and mechanical action was still necessary. Depending on the spinning solution, three types of ES mats were produced: polyamide 6,6, pullulan, and pullulan with melanin nanoparticles. The latter, under irradiation, allows for a localised temperature increase that accelerates and facilitates the removal of less soluble layers (e.g. reticulated alkyd-based paints). All the systems produced, and the mock-ups used, were extensively characterised using multi-analytical protocols. Finally, a monitoring protocol and image treatment based on photoluminescence macro-imaging are proposed. This set-up allowed the study of the removal mechanism of dammar varnish and the semi-quantification of its residues. These initial results form the basis for optimising the acquisition set-up and data processing.

Relevance:

100.00%

Publisher:

Abstract:

This thesis investigates the legal, ethical, technical, and psychological issues of general data processing and artificial intelligence practices, and the explainability of AI systems. It consists of two main parts. In the initial part, we provide a comprehensive overview of the big data processing ecosystem and the main challenges we face today. We then evaluate the GDPR’s data privacy framework in the European Union. The Trustworthy AI Framework proposed by the EU’s High-Level Expert Group on AI (AI HLEG) is examined in detail. The ethical principles for the foundation and realization of Trustworthy AI are analyzed along with the assessment list prepared by the AI HLEG. Then, we list the main big data challenges identified by European researchers and institutions and provide a literature review of the technical and organizational measures to address them. A quantitative analysis is conducted on the identified big data challenges and the corresponding measures, leading to practical recommendations for better data processing and AI practices in the EU. In the subsequent part, we concentrate on the explainability of AI systems. We clarify the terminology and list the goals of explainable AI systems. We identify the reasons for the explainability-accuracy trade-off and how it can be addressed. We conduct a comparative cognitive analysis between human reasoning and machine-generated explanations, with the aim of understanding how explainable AI can contribute to human reasoning. We then focus on the technical and legal responses to the explainability problem. Here, the GDPR’s right-to-explanation framework and safeguards are analyzed in depth, together with their contribution to the realization of Trustworthy AI. Finally, we analyze the explanation techniques applicable at different stages of machine learning and propose several recommendations, in chronological order, for developing GDPR-compliant and Trustworthy XAI systems.

Relevance:

100.00%

Publisher:

Abstract:

Riding the wave of recent groundbreaking achievements, artificial intelligence (AI) is currently the buzzword on everybody’s lips, and, by allowing algorithms to learn from historical data, Machine Learning (ML) has emerged as its pinnacle. The multitude of algorithms, each with unique strengths and weaknesses, highlights the absence of a universal solution and poses a challenging optimization problem. In response, automated machine learning (AutoML) navigates vast search spaces within minimal time constraints. By lowering entry barriers, AutoML has emerged as a promising route to the democratization of AI, though it still faces challenges. In data-centric AI, the discipline of systematically engineering the data used to build an AI system, the configuration of data pipelines poses a particular challenge. We devise a methodology for building effective data pre-processing pipelines in supervised learning, as well as a data-centric AutoML solution for unsupervised learning. In human-centric AI, many current AutoML tools were built not around the user but around algorithmic ideas, raising ethical and social-bias concerns. We contribute by deploying AutoML tools that aim to complement, rather than replace, human intelligence. In particular, we provide solutions for single-objective and multi-objective optimization and showcase the challenges and potential of novel interfaces featuring large language models. Finally, application areas that rely on numerical simulators, often related to Earth observation, tend to be particularly high-impact and address important challenges such as climate change and crop life cycles. We work towards coupling these physical simulators with (Auto)ML solutions for physics-aware AI. Specifically, in precision farming, we design a smart irrigation platform that allows real-time monitoring of soil moisture, predicts future moisture values, and estimates water demand to schedule irrigation.
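As a minimal illustration of the kind of search that AutoML automates, the sketch below enumerates a few candidate pre-processing + model pipelines and keeps the best cross-validated one. The use of scikit-learn, the toy dataset, and the exhaustive candidate list are all assumptions for illustration; this is not the tooling of the thesis, and real AutoML systems search far larger spaces with smarter strategies.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler, MinMaxScaler
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

# Candidate pre-processing + model pipelines (a tiny, hypothetical search space).
X, y = load_iris(return_X_y=True)
candidates = {
    'scale+logreg': make_pipeline(StandardScaler(),
                                  LogisticRegression(max_iter=500)),
    'minmax+tree': make_pipeline(MinMaxScaler(),
                                 DecisionTreeClassifier(random_state=0)),
    'tree': make_pipeline(DecisionTreeClassifier(random_state=0)),
}

# Score every candidate by 5-fold cross-validation and keep the best one.
scores = {name: cross_val_score(pipe, X, y, cv=5).mean()
          for name, pipe in candidates.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

Even this naive loop captures the core AutoML contract: the user supplies data, and the system returns a fitted-for-purpose pipeline chosen by an objective criterion.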

Relevance:

100.00%

Publisher:

Abstract:

A method using the ring-oven technique for pre-concentration on filter paper discs, combined with near-infrared hyperspectral imaging, is proposed to identify four detergent and dispersant additives and to determine their concentration in gasoline. Different approaches were compared to select the best image data processing for gathering the relevant spectral information. This was attained by selecting the pixels of the region of interest (ROI), using a pre-calculated threshold value on the PCA scores arranged as histograms, to select the spectra set; summing the selected spectra to achieve representativeness; and compensating for the superimposed filter-paper spectral information, also supported by score histograms for each individual sample. The best classification model was achieved using linear discriminant analysis with a genetic algorithm (LDA/GA), whose correct classification rate in the external validation set was 92%. Prior classification of the type of additive present in the gasoline is necessary to define the PLS model required for its quantitative determination. Considering that two of the additives studied present high spectral similarity, a single PLS regression model was constructed to predict their content in gasoline, while two additional models were used for the remaining additives. External validation of these regression models showed mean percentage errors of prediction varying from 5 to 15%.
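The ROI-selection step can be sketched as follows. The cube dimensions, the threshold value, and the `select_roi_spectrum` helper are hypothetical, not the authors' actual code; for simplicity the threshold is applied to the magnitude of the first principal-component score, which sidesteps PCA's sign ambiguity.

```python
import numpy as np
from sklearn.decomposition import PCA

def select_roi_spectrum(cube, threshold):
    """Select ROI pixels by thresholding the PC1 score magnitude,
    then sum the retained spectra for representativeness.

    cube: (rows, cols, bands) hyperspectral image.
    threshold: cut-off read from the histogram of PC1 scores.
    Returns the summed ROI spectrum and the boolean ROI mask.
    """
    rows, cols, bands = cube.shape
    spectra = cube.reshape(-1, bands)                 # one spectrum per pixel
    scores = PCA(n_components=1).fit_transform(spectra)[:, 0]
    mask = np.abs(scores) > threshold                 # keep outlying pixels
    roi_spectrum = spectra[mask].sum(axis=0)          # summed ROI spectrum
    return roi_spectrum, mask.reshape(rows, cols)

# Synthetic 8x8x10 cube: a 3x3 pre-concentrated 'spot' over a flat background.
rng = np.random.default_rng(0)
cube = rng.normal(1.0, 0.01, size=(8, 8, 10))
cube[2:5, 2:5, :] += 2.0
spectrum, mask = select_roi_spectrum(cube, threshold=3.0)
print(int(mask.sum()))  # 9: the 3x3 spot pixels
```

A background-compensation step (subtracting a filter-paper reference spectrum) would follow the same pattern on the summed spectrum.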

Relevance:

100.00%

Publisher:

Abstract:

Universidade Estadual de Campinas. Faculdade de Educação Física

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: To study the trend in mortality related to Chagas disease as reported in any line or part of the medical certification section of the death certificate. METHODS: Data came from the multiple-cause-of-death databases of the Fundação Sistema Estadual de Análise de Dados de São Paulo (SEADE) between 1985 and 2006. Causes of death were characterized as underlying, associated (non-underlying), and total mentions. RESULTS: Over the 22-year period, there were 40,002 deaths related to Chagas disease, of which 34,917 (87.29%) listed it as the underlying cause and 5,085 (12.71%) as an associated cause. A 56.07% decline in the mortality rate was observed for the underlying cause, with stability for the associated cause. The number of deaths was 44.5% higher among men than among women. The fact that 83.5% of deaths occurred from age 45 onwards reveals a cohort effect. The main associated causes when Chagas disease was the underlying cause were direct complications of cardiac involvement, such as conduction disorders, arrhythmias, and heart failure. For Chagas disease as an associated cause, the underlying causes identified were ischemic heart disease, cerebrovascular diseases, and neoplasms. CONCLUSIONS: For total mentions, the mortality rate fell by 51.34%, whereas the number of deaths fell by only 5.91%; the decline was smaller among women, with a shift of deaths towards older ages. The multiple-cause-of-death methodology contributed to broadening knowledge of the natural history of Chagas disease.

Relevance:

100.00%

Publisher:

Abstract:

This work is part of a research effort under way since 2000, whose main objective is to measure small dynamic displacements using L1 GPS receivers. A very sensitive way to detect millimetric periodic displacements is based on the Phase Residual Method (PRM). This method relies on the frequency-domain analysis of the phase residuals resulting from the L1 double-difference static data processing of two satellites at nearly orthogonal elevation angles. In this article, it is proposed to obtain the phase residuals directly from the raw phase observable collected on a short baseline during a limited time span, instead of obtaining the residual data file from standard GPS processing programs, which do not always allow the choice of the desired satellites. In order to improve the ability to detect millimetric oscillations, two filtering techniques are introduced. One is autocorrelation, which reduces the phase noise with random time behavior. The other is the running mean, which separates low-frequency from high-frequency phase sources. Two trials were carried out to verify the proposed method and filtering techniques: one simulates a 2.5-millimeter vertical antenna displacement, and the second uses GPS data collected during a bridge load test. The results showed good consistency in detecting millimetric oscillations.
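The two filtering techniques can be sketched numerically. The sampling rate, window length, and signal parameters below are hypothetical, not values from the study; the point is only that a running mean removes the slow trend and the autocorrelation of the detrended residuals exposes the periodic displacement as a peak at its period.

```python
import numpy as np

def running_mean(x, window):
    """Moving average used to separate the slow trend (low-frequency
    phase sources) from the fast oscillations of interest."""
    return np.convolve(x, np.ones(window) / window, mode='same')

def autocorrelation(x):
    """Normalized autocorrelation: random phase noise decays quickly,
    while a periodic displacement survives as a peak at its period."""
    x = x - x.mean()
    acf = np.correlate(x, x, mode='full')[x.size - 1:]
    return acf / acf[0]

# Simulated 1 Hz phase residuals: a 2.5 mm oscillation with a 50 s period,
# buried in noise plus a slow drift (all values hypothetical).
t = np.arange(600.0)
residuals = (2.5 * np.sin(2 * np.pi * t / 50.0)
             + 1.5 * np.random.default_rng(1).normal(size=t.size)
             + 0.01 * t)

detrended = residuals - running_mean(residuals, window=101)
core = detrended[60:-60]               # drop convolution edge effects
acf = autocorrelation(core)
lag = 25 + int(np.argmax(acf[25:75]))  # search near the expected period
print(lag)                             # close to the 50-sample period
```

In the PRM proper, the same residual series would instead be examined in the frequency domain; the autocorrelation peak here is the time-domain counterpart of that spectral line.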

Relevance:

100.00%

Publisher:

Abstract:

Three-dimensional spectroscopy techniques are becoming more and more popular, producing an increasing number of large data cubes. The challenge of extracting information from these cubes requires the development of new techniques for data processing and analysis. We apply the recently developed technique of principal component analysis (PCA) tomography to a data cube from the center of the elliptical galaxy NGC 7097 and show that this technique is effective in decomposing the data into physically interpretable information. We find that the first five principal components of our data are associated with distinct physical characteristics. In particular, we detect a low-ionization nuclear emission-line region (LINER) with a weak broad component in the Balmer lines. Two images of the LINER are present in our data: one seen through a disk of gas and dust, and the other after scattering by free electrons and/or dust particles in the ionization cone. Furthermore, we extract the spectrum of the LINER, decontaminated from stellar and extended nebular emission, using only PCA tomography. We anticipate that the scattered image is polarized, owing to its scattered nature.
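A minimal sketch of PCA tomography on a toy data cube, assuming a NumPy/scikit-learn implementation (the authors' actual pipeline is not specified): the cube is unfolded into a spectra matrix, and each principal component then yields an eigenspectrum plus a score map, the "tomogram".

```python
import numpy as np
from sklearn.decomposition import PCA

def pca_tomography(cube, n_components=5):
    """Unfold a (ny, nx, n_lambda) cube into a spectra matrix and run PCA.
    Each component gives an eigenspectrum and a 'tomogram': the image of
    that component's scores across the spaxels."""
    ny, nx, nl = cube.shape
    spectra = cube.reshape(ny * nx, nl)          # one spectrum per spaxel
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(spectra)
    tomograms = scores.T.reshape(n_components, ny, nx)
    return tomograms, pca.components_, pca.explained_variance_ratio_

# Toy cube: flat continuum everywhere, plus an 'emission line' at one
# wavelength channel in one quadrant (standing in for a nuclear source).
rng = np.random.default_rng(2)
cube = 1.0 + rng.normal(0.0, 0.01, size=(16, 16, 40))
cube[:8, :8, 20] += 1.0
tomo, eigenspectra, evr = pca_tomography(cube, n_components=3)
print(evr[0])  # PC1 dominates, isolating the line-emitting region
```

In this toy case the PC1 tomogram lights up the emitting quadrant and its eigenspectrum peaks at the line channel, which is the sense in which PCA tomography decomposes a cube into physically interpretable pieces.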

Relevance:

100.00%

Publisher:

Abstract:

The Nyvlt method was used to determine the kinetic parameters of commercial xylitol in ethanol:water (50:50 %w/w) solution by batch cooling crystallization. The kinetic exponents (n, g and m) and the system kinetic constant (B(N)) were determined. Model experiments were carried out in order to verify the combined effects of saturation temperature (40, 50 and 60 degrees C) and cooling rate (0.10, 0.25 and 0.50 degrees C/min) on these parameters. The fit between experimental and calculated crystal sizes showed a mean deviation of 11.30%. (C) 2007 Elsevier B.V. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

Optical monitoring systems are necessary to manufacture multilayer thin-film optical filters with low tolerance on the spectrum specification. Furthermore, for better accuracy in the measurement of film thickness, direct monitoring is a must. Direct monitoring means acquiring spectrum data, in real time, from the optical component undergoing the film deposition itself. For depositing films on the surfaces of optical components, the high-vacuum evaporation chamber is the most popular equipment. Inside the evaporator, at the top of the chamber, there is a metallic support with several holes where the optical components are mounted. This support rotates to promote film homogenization. To acquire a measurement of the spectrum of the film being deposited, it is necessary to pass a light beam through a glass witness undergoing the film deposition process and to collect a sample of the light beam with a spectrometer. As both the light beam and the light collector are stationary, a synchronization system is required to identify the moment at which the optical component passes through the light beam.

Relevance:

100.00%

Publisher:

Abstract:

An investigation was performed on the effect of temperature and organic load on the stability and efficiency of a 1.8-L fluidized-bed anaerobic sequencing batch reactor (ASBR) containing granulated biomass. Assays were carried out employing a superficial upflow velocity of 7 m/h, a total cycle length of 6 h, and a synthetic wastewater volume of 1.3 L per cycle. The fluidized-bed ASBR was operated at 15, 20, 25 and 30 degrees C with influent organic matter concentrations of 500 and 1000 mgCOD/L. The system showed stability under all conditions and presented filtered-sample removal efficiencies ranging from 79 to 86%. A first-order kinetic model could be fitted to the experimental organic matter concentration profiles. The specific kinetic parameter values of this model ranged from 0.0435 to 0.2360 L/(gTVS h) under the implemented operating conditions. In addition, from the slope of an Arrhenius plot, the activation energy values were calculated to be 16,729 and 12,673 cal/mol for operation with 500 and 1000 mgCOD/L, respectively. These results show that treatment of synthetic wastewater with a concentration of 500 mgCOD/L was more sensitive to temperature variations than treatment of the same residue at 1000 mgCOD/L. Comparing the activation energy value for operation at 500 mgCOD/L with the value obtained by Agibert et al. (S.A. Agibert, M.B. Moreira, S.M. Ratusznei, J.A.D. Rodrigues, M. Zaiat, E. Foresti, Influence of temperature on performance of an ASBBR with circulation applied to treatment of low-strength wastewater, Applied Biochemistry and Biotechnology, 136 (2007) 193-206) in an ASBBR treating the same wastewater at the same concentration, the value obtained in the fluidized-bed ASBR was higher, indicating that treatment of synthetic wastewater in a reactor containing granulated biomass was more sensitive to temperature variations than treatment using immobilized biomass. (c) 2008 Elsevier B.V. All rights reserved.
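The two fitting steps described above, the first-order kinetic fit and the Arrhenius-plot slope, can be sketched as follows. The helper names and all numerical profiles are illustrative; the value 16,729 cal/mol from the text is used only to generate consistent synthetic rate constants, so the sketch shows how the slope of ln k versus 1/T recovers the activation energy.

```python
import numpy as np

# First-order kinetics: C(t) = C0 * exp(-k t), so k is minus the slope
# of ln C versus t.
def first_order_k(t, conc):
    slope, _ = np.polyfit(t, np.log(conc), 1)
    return -slope

# Arrhenius: k = A * exp(-Ea / (R T)); ln k versus 1/T has slope -Ea/R.
def activation_energy(temps_c, ks, R=1.987):       # R in cal/(mol K)
    inv_T = 1.0 / (np.asarray(temps_c) + 273.15)
    slope, _ = np.polyfit(inv_T, np.log(ks), 1)
    return -slope * R                              # Ea in cal/mol

# Synthetic decay profile over one 6-h cycle (hypothetical k = 0.35 1/h).
t = np.linspace(0.0, 6.0, 13)
conc = 500.0 * np.exp(-0.35 * t)                   # mgCOD/L
k = first_order_k(t, conc)

# Rate constants generated from the paper's Ea for 500 mgCOD/L, with an
# arbitrary pre-factor A, to demonstrate recovery of Ea from the slope.
temps = [15, 20, 25, 30]                           # degrees C, as in the study
ks = [1.0e12 * np.exp(-16729.0 / (1.987 * (T + 273.15))) for T in temps]
Ea = activation_energy(temps, ks)
print(round(k, 4), round(Ea))                      # 0.35 and ~16729 cal/mol
```

With noisy experimental profiles the same log-linear fits apply; only the residuals around the fitted lines grow.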

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a technological viability study of the treatment of automobile industry wastewater in an anaerobic sequencing batch biofilm reactor containing immobilized biomass (AnSBBR) with a draft tube. The reactor was operated in 8-h cycles, with agitation at 400 rpm, at 30 degrees C, treating 2.0 L of wastewater per cycle. Initially, the efficiency and stability of the reactor were studied when supplied with nutrients and alkalinity. A removal efficiency of 88% was obtained at a volumetric loading rate (VLR) of 3.09 mg COD/L day. When the VLR was increased to 6.19 mg COD/L day, the system presented stable operation with the efficiency reduced to 71%. In a second stage, the AnSBBR was operated treating wastewater in natura, i.e., without nutrient supplementation and only with alkalinity, while the feed strategy was varied. The first strategy consisted in feeding 2.0 L batchwise (10 min); the second in feeding 1.0 L of influent batchwise (10 min) plus an additional 1.0 L fed-batchwise (4 h), both discharging 2.0 L of effluent in 10 min. The third strategy maintained 1.0 L of treated effluent in the reactor, without discharge, while 1.0 L of influent was fed fed-batchwise (4 h), with discharge of 1.0 L of effluent in 10 min. For all implemented strategies (VLRs of 1.40, 2.57 and 2.61 mg COD/L day) the system presented stability and removal efficiencies of approximately 80%. These results show that the AnSBBR presents operational flexibility, as the influent can be fed according to industry availability. In industrial processes this is a considerable advantage, as the influent may be prone to variations. Moreover, for all the investigated conditions, the kinetic parameters were obtained by fitting a first-order model to the profiles of organic matter, total volatile acids, and methane concentrations. Analysis of the kinetic parameters showed that the best strategy is feeding 1.0 L of influent batchwise (10 min) and 1.0 L fed-batchwise (4 h) in an 8-h cycle. (c) 2007 Elsevier B.V. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

An investigation has been performed on the effect of liquid phase recirculation velocity and increasing influent concentration on the stability and efficiency of an anaerobic sequencing batch reactor (ASBR) containing granular biomass. The reactor treated 1.3 L synthetic wastewater at 30 degrees C in 6 h cycles. Initially the effect of recirculation velocity was investigated employing velocities of 5, 7 and 10 m/h and influent concentration of 500 mg COD/L. At these velocities, filtered sample organic matter removal efficiencies were 83, 85 and 84%, respectively. A first order kinetic model could also be fitted to the experimental organic matter concentration profiles. The kinetic parameter values of this model were 1.35, 2.36 and 1.00 h(-1) at the recirculation velocities of 5, 7 and 10 m/h, respectively. The recirculation velocity of 7 m/h was found to be the best operating strategy and this value was maintained while the influent concentration was altered in order to verify system efficiency and stability at increasing organic load. Influent concentration of 1000 mg COD/L resulted in filtered sample organic matter removal efficiency of 80%, and a first order kinetic parameter value of 1.14 h(-1), whereas the concentration of 1500 mg COD/L resulted in an efficiency of 82% and a kinetic parameter value of 1.31 h(-1). (C) 2007 Elsevier B.V. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, processing methods of Fourier optics implemented in a digital holographic microscopy system are presented. The proposed methodology is based on the ability of digital holography to reconstruct the whole recorded wavefront and, consequently, to determine the phase and intensity distribution in any arbitrary plane located between the object and the recording plane. In this way, in digital holographic microscopy the field produced by the objective lens can be reconstructed along its propagation, allowing the reconstruction of the back focal plane of the lens, so that the complex amplitudes of the Fraunhofer diffraction, or equivalently the Fourier transform, of the light distribution across the object can be known. Manipulation of the Fourier-transform plane makes possible the design of digital methods of optical processing and image analysis. The proposed method has great practical utility and represents a powerful tool in image analysis and data processing. The theoretical aspects of the method are presented, and its validity has been demonstrated using computer-generated holograms and simulated images of microscopic objects. (c) 2007 Elsevier B.V. All rights reserved.
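The core idea, reconstructing the Fourier plane of a field and manipulating it there before transforming back to the image plane, can be sketched with plain FFTs. The helper names and the synthetic object below are illustrative, and the usual centered-FFT conventions are assumed rather than the paper's exact reconstruction chain.

```python
import numpy as np

def fourier_plane(field):
    """Complex amplitude at the back focal plane: the centered 2-D
    Fourier transform of the object field."""
    return np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field)))

def image_plane(spectrum):
    """Inverse transform back from the Fourier plane to the image plane."""
    return np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(spectrum)))

def low_pass(field, radius):
    """Block spatial frequencies outside a circular aperture placed in
    the reconstructed Fourier plane, then return the filtered image."""
    F = fourier_plane(field)
    ny, nx = field.shape
    yy, xx = np.mgrid[-ny // 2:ny // 2, -nx // 2:nx // 2]
    F[yy**2 + xx**2 > radius**2] = 0.0     # circular aperture mask
    return image_plane(F)

# Synthetic object: a sharp-edged square aperture. Low-pass filtering in
# the Fourier plane smooths its edges, a classic 4f-style operation.
obj = np.zeros((64, 64), dtype=complex)
obj[24:40, 24:40] = 1.0
filtered = low_pass(obj, radius=8)
```

Any other Fourier-plane manipulation (high-pass, phase contrast, dark field) follows the same pattern: modify `F` between the two transforms.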