958 results for: Iron foundries; Production control; Data processing
Abstract:
Model predictive control (MPC) has often been referred to in the literature as a potential method for more efficient control of building heating systems. Though a significant performance improvement can be achieved with an MPC strategy, the complexity introduced to the commissioning of the system is often prohibitive. Models are required which can capture the thermodynamic properties of the building with sufficient accuracy for meaningful predictions to be made. Furthermore, a large number of tuning weights may need to be determined to achieve a desired performance. For MPC to become a practicable alternative, these issues must be addressed. Acknowledging the impact of the external environment as well as the interaction of occupants on the thermal behaviour of the building, techniques have been developed in this work for deriving building models from data in which large, unmeasured disturbances are present. A spatio-temporal filtering process was introduced to determine estimates of the disturbances from measured data; these estimates were then incorporated with metaheuristic search techniques to derive high-order simulation models capable of replicating the thermal dynamics of a building. While a high-order simulation model allowed control strategies to be analysed and compared, low-order models were required for use within the MPC strategy itself. The disturbance estimation techniques were adapted for use with system-identification methods to derive such models. MPC formulations were then derived to enable a more straightforward commissioning process and implemented in a validated simulation platform. A prioritised-objective strategy was developed which allowed the tuning parameters typically associated with an MPC cost function to be omitted from the formulation, by separating the conflicting requirements of comfort satisfaction and energy reduction within a lexicographic framework.
The improved ability of the formulation to be set up and reconfigured under faulted conditions was demonstrated.
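The prioritised-objective idea can be illustrated with a toy, single-step sketch. Everything below (the first-order room model, the comfort band, the candidate heating powers) is an invented assumption for illustration, not the thesis's actual formulation; the point is that a lexicographic ordering can be realised by minimising a (comfort violation, energy) tuple, so that energy use only breaks ties among equally comfortable inputs and no cost-function weights are needed.

```python
# Toy lexicographic ("prioritised-objective") heating controller step.
# The model, parameters, and comfort band are invented for illustration.

def predict_temp(T, u, T_out=5.0, a=0.9, b=0.5):
    """One-step first-order room model: decay toward outdoor temperature plus heating."""
    return a * T + (1 - a) * T_out + b * u

def comfort_violation(T, low=20.0, high=23.0):
    """Distance of the temperature from the comfort band (zero inside the band)."""
    return max(low - T, T - high, 0.0)

def lexicographic_step(T, candidates=(0.0, 1.0, 2.0, 3.0, 4.0)):
    """Pick a heating power: first minimise comfort violation, then energy.

    Sorting by the tuple (violation, energy) realises the priority ordering:
    energy only breaks ties among inputs that are equally comfortable.
    """
    return min(candidates, key=lambda u: (comfort_violation(predict_temp(T, u)), u))

u_warm = lexicographic_step(22.0)  # already comfortable: cheapest input wins
u_cold = lexicographic_step(18.0)  # cold room: comfort priority drives heating up
```

Replacing the candidate grid with a proper constrained optimisation would give a closer analogue of an MPC formulation, but the tie-breaking structure is the same.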
Abstract:
Regular physical activity plays a fundamental role in the prevention and control of musculoskeletal disorders within the occupational activity of physical education teachers. Objective: The purpose of the study was to determine the relationship between physical activity levels and the prevalence of musculoskeletal disorders in physical education teachers from 42 official educational institutions in Bogotá, Colombia. Methods: This was a cross-sectional study of 262 physical education teachers from 42 official educational institutions in Bogotá, Colombia. The Nordic Musculoskeletal Questionnaire and the short version of the IPAQ questionnaire, used to identify physical activity levels, were self-administered. Measures of central tendency and dispersion were obtained for quantitative variables, and relative frequencies for qualitative variables. Lifetime prevalence and the percentage of job relocation were calculated for teachers who had suffered different types of pain. A simple binary logistic regression model was used to estimate the relationship between pain and the teachers' sociodemographic variables. Analyses were performed in SPSS version 20; a value of p < 0.05 was considered significant for hypothesis testing, with a corresponding confidence level for parameter estimation. Results: The response rate was 83.9%, and 262 records were considered valid; 22.5% of participants were female, and most physical education teachers were between 25 and 35 years old (43.9%). Regarding musculoskeletal disorders, 16.9% of the teachers reported having ever suffered discomfort in the neck, 17.2% in the shoulder, 27.9% in the back, 7.93% in the arm, and 8.4% in the hand.
Teachers with higher physical activity levels reported a lower prevalence of musculoskeletal disorders (16.9% for the neck; 27.7% for the dorsal/lumbar region) than subjects with low physical activity levels. The presence of disorders was associated with years of experience (OR 3.39, 95% CI 1.41-7.65), female gender (OR 4.94, 95% CI 1.94-12.59), age (OR 5.06, 95% CI 1.25-20.59), and being responsible for more than 400 students during the working day (OR 4.50, 95% CI 1.74-11.62). Conclusions: Among physical education teachers, no statistically significant relationship was found between self-reported physical activity levels and musculoskeletal disorders.
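The odds ratios with 95% confidence intervals reported in this study follow the standard 2x2-table computation with a Wald interval on the log scale. A minimal sketch, using invented counts rather than the study's data, is:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without,
    c = unexposed with outcome, d = unexposed without.
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) from the four cell counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, chosen only to exercise the function
or_, lo, hi = odds_ratio_ci(20, 30, 10, 60)
```

An interval excluding 1.0, as in the associations reported above, indicates a statistically significant association at the 5% level.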
Abstract:
In the digital age, e-health technologies play a pivotal role in the processing of medical information. As personal health data represent sensitive information concerning a data subject, enhancing data protection and the security of systems and practices has become a primary concern. In recent years, there has been increasing interest in the concept of Privacy by Design (PbD), which aims at developing a product or a service in a way that supports privacy principles and rules. In the EU, Article 25 of the General Data Protection Regulation provides a binding obligation to implement Data Protection by Design (DPbD) technical and organisational measures. This thesis explores how an e-health system could be developed, and how data processing activities could be carried out, to apply data protection principles and requirements from the design stage. The research attempts to bridge the gap between the legal and technical disciplines on DPbD by providing a set of guidelines for the implementation of the principle. The work is based on literature review, legal and comparative analysis, and investigation of existing technical solutions and engineering methodologies. The work can be differentiated into theoretical and applied perspectives. First, it conducts a critical legal analysis of the principle of PbD and studies the DPbD legal obligation and the related provisions. It then contextualises the rule in the health care field by investigating the applicable legal framework for personal health data processing. Moreover, the research examines the US legal system through a comparative analysis. Adopting an applied perspective, the research investigates existing technical methodologies and tools for designing data protection, and it proposes a set of comprehensive DPbD organisational and technical guidelines for a crucial case study: an Electronic Health Record system.
Abstract:
The aim of this novel experimental study is to investigate the behaviour of a 2 m x 2 m model of a masonry groin vault, built by assembling blocks made of a 3D-printed plastic skin filled with mortar. The groin vault was chosen because of the large presence of this vulnerable roofing system in the historical heritage. Experimental tests on the shaking table are carried out to explore the vault response under two support boundary conditions, involving four lateral confinement modes. Processing of the marker displacement data has made it possible to examine the collapse mechanisms of the vault, based on the deformed shapes of the arches. A numerical evaluation then follows, to provide the orders of magnitude of the displacements associated with these mechanisms. Given that these displacements are related to the shortening and elongation of the arches, the final objective is the definition of a critical elongation between two diagonal bricks and, consequently, of a diagonal portion. This study aims to continue the previous work and to take another step forward in the research on ground motion effects on masonry structures.
Abstract:
The final goal of the thesis should be a real-world application in the production test data environment. This includes pre-processing the data, building models, and visualizing the results. To this end, different machine learning models oriented towards outlier prediction should be investigated using a real dataset. Finally, the different outlier prediction algorithms should be compared and their performance discussed.
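As a hedged illustration of the kind of outlier predictors that could be compared in such a study, two classical baseline detectors (z-score and Tukey's IQR fences) are sketched below on an invented toy dataset. The thesis's actual models, thresholds, and production test data are not specified here.

```python
import statistics

def zscore_outliers(xs, thresh=2.0):
    """Flag points more than `thresh` standard deviations from the mean."""
    mu, sd = statistics.mean(xs), statistics.stdev(xs)
    return [x for x in xs if abs(x - mu) > thresh * sd]

def iqr_outliers(xs, k=1.5):
    """Tukey's fences: flag points outside [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, _, q3 = statistics.quantiles(xs, n=4)
    iqr = q3 - q1
    return [x for x in xs if x < q1 - k * iqr or x > q3 + k * iqr]

# Toy measurement series with one injected outlier
data = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 25.0]
```

Comparing detectors on the same series, as both functions allow here, mirrors the comparison-and-discussion step the abstract describes; note that a single extreme value inflates the standard deviation, which is why the z-score threshold here is set below the textbook value of 3.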
Abstract:
The thesis represents the conclusive outcome of the European Joint Doctorate programme in Law, Science & Technology funded by the European Commission through the Marie Skłodowska-Curie Innovative Training Networks actions within H2020, grant agreement n. 814177. The tension between data protection and privacy on one side, and the need to grant further uses of processed personal data on the other, is investigated, drawing the lines of the technological development of the de-anonymization/re-identification risk with an explorative survey. After acknowledging its span, it is questioned whether a certain degree of anonymity can still be granted, focusing on a double perspective: an objective and a subjective one. The objective perspective focuses on the data processing models per se, while the subjective perspective investigates whether the distribution of roles and responsibilities among stakeholders can ensure data anonymity.
Abstract:
Cleaning is one of the most important and delicate procedures in the restoration process. When developing a new cleaning system, it is fundamental to consider its selectivity towards the layer to be removed, its non-invasiveness towards the layer to be preserved, its sustainability, and its non-toxicity. Besides assessing its efficacy, it is important to understand its mechanism through analytical protocols that strike a balance between cost, practicality, and reliable interpretation of results. In this thesis, the development of cleaning systems based on the coupling of electrospun fabrics (ES) and greener organic solvents is proposed. Electrospinning is a versatile technique that allows the production of micro/nanostructured non-woven mats, which have already been used as absorbents in various scientific fields but, to date, not in the restoration field. The systems produced proved to be effective for the removal of dammar varnish from paintings, where the ES act not only as solvent-binding agents but also as adsorbents of the partially solubilised varnish through capillary rise, thus enabling a one-step procedure. They have also been successfully applied to the removal of spray varnish from marble substrates and wall paintings. Due to the complexity of the materials, the procedure had to be adapted case by case, and mechanical action was still necessary. According to the spinning solution, three types of ES mats were produced: polyamide 6,6, pullulan, and pullulan with melanin nanoparticles. The latter, under irradiation, allows a localised temperature increase, accelerating and facilitating the removal of less soluble layers (e.g. reticulated alkyd-based paints). All the systems produced, and the mock-ups used, were extensively characterised using multi-analytical protocols. Finally, a monitoring protocol and image treatment based on photoluminescence macro-imaging are proposed.
This set-up allowed the removal mechanism of dammar varnish to be studied and its residues to be semi-quantified. These initial results form the basis for optimising the acquisition set-up and data processing.
Abstract:
This thesis investigates the legal, ethical, technical, and psychological issues of general data processing and artificial intelligence practices and the explainability of AI systems. It consists of two main parts. In the initial section, we provide a comprehensive overview of the big data processing ecosystem and the main challenges we face today. We then evaluate the GDPR’s data privacy framework in the European Union. The Trustworthy AI Framework proposed by the EU’s High-Level Expert Group on AI (AI HLEG) is examined in detail. The ethical principles for the foundation and realization of Trustworthy AI are analyzed along with the assessment list prepared by the AI HLEG. Then, we list the main big data challenges the European researchers and institutions identified and provide a literature review on the technical and organizational measures to address these challenges. A quantitative analysis is conducted on the identified big data challenges and the measures to address them, which leads to practical recommendations for better data processing and AI practices in the EU. In the subsequent part, we concentrate on the explainability of AI systems. We clarify the terminology and list the goals aimed at the explainability of AI systems. We identify the reasons for the explainability-accuracy trade-off and how we can address it. We conduct a comparative cognitive analysis between human reasoning and machine-generated explanations with the aim of understanding how explainable AI can contribute to human reasoning. We then focus on the technical and legal responses to remedy the explainability problem. In this part, GDPR’s right to explanation framework and safeguards are analyzed in-depth with their contribution to the realization of Trustworthy AI. Then, we analyze the explanation techniques applicable at different stages of machine learning and propose several recommendations in chronological order to develop GDPR-compliant and Trustworthy XAI systems.
Abstract:
A method using the ring-oven technique for pre-concentration in filter paper discs and near infrared hyperspectral imaging is proposed to identify four detergent and dispersant additives, and to determine their concentration in gasoline. Different approaches were used to select the best image data processing in order to gather the relevant spectral information. This was attained by selecting the pixels of the region of interest (ROI), using a pre-calculated threshold value of the PCA scores arranged as histograms, to select the spectra set; summing up the selected spectra to achieve representativeness; and compensating for the superimposed filter paper spectral information, also supported by scores histograms for each individual sample. The best classification model was achieved using linear discriminant analysis and genetic algorithm (LDA/GA), whose correct classification rate in the external validation set was 92%. Previous classification of the type of additive present in the gasoline is necessary to define the PLS model required for its quantitative determination. Considering that two of the additives studied present high spectral similarity, a PLS regression model was constructed to predict their content in gasoline, while two additional models were used for the remaining additives. The results for the external validation of these regression models showed a mean percentage error of prediction varying from 5 to 15%.
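The pixel-selection step described above, keeping the pixels of the region of interest whose PCA score exceeds a pre-calculated threshold and then summing the selected spectra for representativeness, can be sketched as follows. The scores, spectra, and threshold are invented toy values, and the PCA decomposition and filter-paper compensation steps are omitted.

```python
# Toy sketch of ROI selection from per-pixel PCA scores followed by
# channel-wise summation of the retained spectra. All values are invented.

def select_roi_and_sum(scores, spectra, threshold):
    """Keep pixels whose score exceeds the threshold; sum their spectra channel-wise."""
    kept = [spec for s, spec in zip(scores, spectra) if s > threshold]
    summed = [sum(channel) for channel in zip(*kept)]
    return kept, summed

scores = [0.1, 0.9, 0.8, 0.2]                # per-pixel PC1 scores (toy)
spectra = [[1, 2], [5, 6], [4, 5], [2, 1]]   # two-channel spectra (toy)
kept, summed = select_roi_and_sum(scores, spectra, threshold=0.5)
```

In the actual workflow the threshold would be derived from the histogram of PCA scores for each sample, rather than fixed as here.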
Abstract:
OBJECTIVE: To study the trend in mortality related to Chagas disease reported in any line or part of the medical certification section of the death certificate. METHODS: Data came from the multiple-cause-of-death databases of the São Paulo State Data Analysis System Foundation (SEADE) between 1985 and 2006. Causes of death were characterized as underlying causes, associated (non-underlying) causes, and total mentions. RESULTS: Over the 22-year period, there were 40,002 deaths related to Chagas disease, of which 34,917 (87.29%) were recorded as the underlying cause and 5,085 (12.71%) as an associated cause. A 56.07% decline in the mortality rate for the underlying cause was observed, with stability for the associated cause. The number of deaths was 44.5% higher among men than among women. The fact that 83.5% of the deaths occurred from age 45 onward reveals a cohort effect. The main associated causes when Chagas disease was the underlying cause were direct complications of cardiac involvement, such as conduction disorders, arrhythmias, and heart failure. When Chagas disease was an associated cause, the underlying causes identified were ischemic heart disease, cerebrovascular diseases, and neoplasms. CONCLUSIONS: For total mentions, the mortality rate fell by 51.34%, whereas the number of deaths fell by only 5.91%; the decline was smaller among women, with deaths shifting to older ages. The multiple-cause-of-death methodology contributed to broadening knowledge of the natural history of Chagas disease.
Abstract:
OBJECTIVE: To evaluate the effectiveness of universal prophylactic supplementation with ferrous sulfate, administered daily or weekly, in preventing anemia in infants. METHODS: Randomized field trial with children aged six to 12 months attending primary health care units in the city of Rio de Janeiro, in 2004-2005. Three concurrent cohorts with universal ferrous sulfate supplementation were formed: a daily group (n=150; 12.5 mg Fe/day), a weekly group (n=147; 25 mg Fe/week), and a control group (n=94). The intervention lasted 24 weeks and was accompanied by educational actions promoting adherence. Serum hemoglobin concentration was analyzed according to its distribution, mean, and the prevalence of anemia (Hb<110.0 g/L) at 12 months of age. Effectiveness was evaluated by intention to treat and by adherence to the protocol, using multiple regression analyses (linear and Poisson). RESULTS: The groups were homogeneous with respect to the characterization variables. The intervention was successfully implemented, with high adherence to the protocol in both exposed groups and no statistical difference between them. After adjustment, only the daily regimen showed a protective effect. In the per-adherence analysis, the daily regimen showed a clear dose-response effect for mean serum hemoglobin and the prevalence of anemia, whereas no protective effect of the weekly regimen was observed. CONCLUSIONS: Only the daily regimen of universal ferrous sulfate supplementation from six to 12 months of age was effective in increasing serum hemoglobin concentration and reducing the risk of anemia.
Abstract:
This work is part of a research effort under way since 2000, whose main objective is to measure small dynamic displacements using L1 GPS receivers. A very sensitive way to detect millimetric periodic displacements is based on the Phase Residual Method (PRM). This method relies on the frequency-domain analysis of the phase residuals resulting from the L1 double-difference static data processing of two satellites at nearly orthogonal elevation angles. In this article, it is proposed to obtain the phase residuals directly from the raw phase observable collected on a short baseline during a limited time span, instead of obtaining the residual data file from regular GPS processing programs, which do not always allow the choice of the desired satellites. To improve the ability to detect millimetric oscillations, two filtering techniques are introduced. One is autocorrelation, which reduces phase noise with random time behavior. The other is a running mean, which separates low-frequency from high-frequency phase sources. Two trials were carried out to verify the proposed method and filtering techniques. The first simulates a 2.5-millimeter vertical antenna displacement, and the second uses GPS data collected during a bridge load test. The results show good consistency in detecting millimetric oscillations.
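The two filters mentioned, autocorrelation and a running mean, can be sketched in a few lines. The window length and the test signals below are illustrative assumptions, not the article's actual settings.

```python
# Minimal sketches of the two filtering ideas: a centered running mean to
# separate low- from high-frequency content, and a sample autocorrelation
# under which random phase noise decorrelates while periodic signals persist.

def running_mean(x, w):
    """Centered moving average with window w (odd); edges use the available samples."""
    half = w // 2
    out = []
    for i in range(len(x)):
        seg = x[max(0, i - half):i + half + 1]
        out.append(sum(seg) / len(seg))
    return out

def detrend(x, w):
    """High-frequency residual after subtracting the running mean."""
    return [xi - mi for xi, mi in zip(x, running_mean(x, w))]

def autocorrelation(x, lag):
    """Normalized sample autocorrelation of x at the given lag."""
    n, mu = len(x), sum(x) / len(x)
    num = sum((x[i] - mu) * (x[i + lag] - mu) for i in range(n - lag))
    den = sum((xi - mu) ** 2 for xi in x)
    return num / den
```

For a periodic displacement, the autocorrelation at a lag equal to the period stays close to 1, while white phase noise drops toward 0, which is what makes the combination useful for pulling millimetric oscillations out of noisy residuals.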
Abstract:
Three-dimensional spectroscopy techniques are becoming more and more popular, producing an increasing number of large data cubes. The challenge of extracting information from these cubes requires the development of new techniques for data processing and analysis. We apply the recently developed technique of principal component analysis (PCA) tomography to a data cube from the center of the elliptical galaxy NGC 7097 and show that this technique is effective in decomposing the data into physically interpretable information. We find that the first five principal components of our data are associated with distinct physical characteristics. In particular, we detect a low-ionization nuclear emission-line region (LINER) with a weak broad component in the Balmer lines. Two images of the LINER are present in our data: one seen through a disk of gas and dust, and the other after scattering by free electrons and/or dust particles in the ionization cone. Furthermore, we extract the spectrum of the LINER, decontaminated from stellar and extended nebular emission, using only the technique of PCA tomography. We anticipate that the scattered image is polarized, owing to its scattered nature.
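A minimal sketch of the PCA-tomography unfolding idea follows: the cube is reshaped into a pixels-by-wavelengths matrix, the leading principal component is found, and the per-pixel scores are refolded into an image ("tomogram"). This is a generic pure-Python illustration using power iteration, not the authors' implementation, and the toy cube below is invented.

```python
# Sketch of PCA tomography's first step: unfold the (ny, nx, n_wavelength)
# cube into rows of spectra, find the first principal component of the
# wavelength-space covariance by power iteration, and refold the per-pixel
# scores into the first tomogram.

def first_tomogram(cube):
    ny, nx = len(cube), len(cube[0])
    rows = [px for row in cube for px in row]          # unfold: pixels x wavelengths
    nl, m = len(rows[0]), ny * nx
    means = [sum(r[k] for r in rows) / m for k in range(nl)]
    centred = [[r[k] - means[k] for k in range(nl)] for r in rows]
    cov = [[sum(c[i] * c[j] for c in centred) for j in range(nl)] for i in range(nl)]
    v = [1.0] * nl                                     # power iteration for PC1
    for _ in range(100):
        w = [sum(cov[i][j] * v[j] for j in range(nl)) for i in range(nl)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    scores = [sum(c[k] * v[k] for k in range(nl)) for c in centred]
    return [scores[r * nx:(r + 1) * nx] for r in range(ny)]  # refold to image
```

On real cubes one would use an optimized eigensolver and inspect the eigenspectrum alongside each tomogram; the refolded scores are what reveal spatial structures such as the two LINER images described above.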
Abstract:
The glued-laminated lumber (glulam) technique is an efficient process for the rational use of wood. Fiber-reinforced polymers (FRPs) associated with glulam beams provide significant improvements in strength and stiffness and alter the failure mode of these structural elements. In this context, this paper presents guidance for glulam beam production, an experimental analysis of glulam beams made of Pinus caribaea var. hondurensis, without and with externally bonded FRP, and theoretical models to evaluate reinforced glulam beams (bending strength and stiffness). Concerning the bending strength of the beams, this paper analyzes only the ultimate limit state in compression and tension. A specific apparatus was used to avoid lateral buckling, since the tested beams have a high height-to-width ratio. The results indicate the need for production control to guarantee higher efficiency of the glulam beams. The FRP introduced on the tension side of the glulam beams improved their bending strength and stiffness as the reinforcement thickness increased. During the beam tests, two failure stages were observed. The first was a tensile failure of the lamination positioned under the reinforcement layer, while the second occurred as a result of preliminary compression yielding on the upper side of the lumber, followed by both a shear failure at the fiber-lumber interface and a tensile failure in the wood. The models show good correlation between the experimental and estimated results.
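As a hedged illustration of how the stiffness gain from a tension-face FRP layer can be estimated, the classical transformed-section method is sketched below. The dimensions and moduli in the example are invented, and the paper's actual theoretical models may differ.

```python
# Transformed-section estimate of bending stiffness EI for a rectangular
# glulam beam (width b, depth h, modulus E_w) with an FRP strip of thickness
# t_f and modulus E_f bonded to the tension (bottom) face. Units: N and mm.

def glulam_frp_EI(b, h, E_w, t_f, E_f):
    n = E_f / E_w                       # modular ratio: FRP transformed to wood
    A_w, A_f = b * h, n * b * t_f       # transformed areas
    y_w, y_f = h / 2, h + t_f / 2       # centroids measured from the top face
    y_bar = (A_w * y_w + A_f * y_f) / (A_w + A_f)   # composite neutral axis
    I_t = (b * h ** 3 / 12 + A_w * (y_bar - y_w) ** 2
           + n * b * t_f ** 3 / 12 + A_f * (y_bar - y_f) ** 2)
    return E_w * I_t

# Invented example: 50 x 150 mm beam, E_w = 10 GPa, 1.2 mm FRP, E_f = 165 GPa
EI_plain = glulam_frp_EI(50, 150, 10000, 0.0, 165000)
EI_frp = glulam_frp_EI(50, 150, 10000, 1.2, 165000)
```

Because the FRP sits far from the neutral axis, even a thin strip raises the parallel-axis term noticeably, which is consistent with the thickness-dependent stiffness gains the abstract reports.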
Abstract:
In this work, an algorithm to compute the envelope of non-destructive testing (NDT) signals is proposed. This method increases processing speed and reduces memory usage in extensive data processing. The procedure also has the advantage of preserving the data information for physical modeling applications of time-dependent measurements. The algorithm is conceived to be applied to the analysis of data from non-destructive testing. A comparison between different envelope methods and the proposed method, applied to the magnetic Barkhausen noise (MBN) signal, is presented.
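The proposed algorithm itself is not reproduced here; as a stand-in, the sketch below shows one common lightweight envelope method (rectification followed by a moving maximum), which illustrates the speed and memory concern the abstract raises, since it needs only a single pass and a short window. The window length is an arbitrary assumption.

```python
# Stand-in envelope sketch: rectify the signal, then take a moving maximum.
# This is a generic lightweight method, not the paper's proposed algorithm.

def envelope(x, w=5):
    """Moving-maximum envelope of the rectified signal (window of 2*w+1 samples)."""
    r = [abs(v) for v in x]
    return [max(r[max(0, i - w):i + w + 1]) for i in range(len(r))]
```

By construction, the envelope never falls below the rectified signal, and for oscillatory NDT records such as MBN bursts it traces the burst amplitude rather than the carrier.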