911 results for architectural design -- data processing
Abstract:
The thesis is the concluding outcome of the European Joint Doctorate programme in Law, Science & Technology, funded by the European Commission through the Marie Skłodowska-Curie Innovative Training Networks actions within H2020, grant agreement no. 814177. It investigates the tension between data protection and privacy on the one hand, and the need to allow further uses of processed personal data on the other, tracing the technological development of the de-anonymization/re-identification risk with an explorative survey. After acknowledging the extent of that risk, the thesis asks whether a certain degree of anonymity can still be guaranteed, from a double perspective: an objective one and a subjective one. The objective perspective focuses on the data processing models per se, while the subjective perspective investigates whether the distribution of roles and responsibilities among stakeholders can ensure data anonymity.
Abstract:
This thesis investigates the legal, ethical, technical, and psychological issues of general data processing and artificial intelligence practices, and the explainability of AI systems. It consists of two main parts. In the first part, we provide a comprehensive overview of the big data processing ecosystem and the main challenges we face today. We then evaluate the GDPR's data privacy framework in the European Union. The Trustworthy AI Framework proposed by the EU's High-Level Expert Group on AI (AI HLEG) is examined in detail. The ethical principles for the foundation and realization of Trustworthy AI are analyzed along with the assessment list prepared by the AI HLEG. We then list the main big data challenges identified by European researchers and institutions and review the literature on technical and organizational measures to address them. A quantitative analysis of the identified challenges and measures leads to practical recommendations for better data processing and AI practices in the EU. In the second part, we concentrate on the explainability of AI systems. We clarify the terminology and list the goals of explainable AI. We identify the reasons behind the explainability-accuracy trade-off and ways to address it. We conduct a comparative cognitive analysis of human reasoning and machine-generated explanations, with the aim of understanding how explainable AI can contribute to human reasoning. We then focus on the technical and legal responses to the explainability problem. Here, the GDPR's right-to-explanation framework and safeguards are analyzed in depth, together with their contribution to the realization of Trustworthy AI. Finally, we analyze the explanation techniques applicable at different stages of machine learning and propose several recommendations, in chronological order, for developing GDPR-compliant and Trustworthy XAI systems.
Abstract:
The thesis focuses on the study and use of the Scala language for data ingestion, processing, and plotting, with emphasis on time series. It opens with an introduction to the main topics, then moves on to requirements analysis, the domain model, the architectural design, and its implementation, closing with concluding remarks on possible future developments. The project component consists of developing an application that supports the chosen libraries and streamlines the workflow. The software is validated through a sequence of different configurations that demonstrate the differences between particular options: each configuration is accompanied by one or more images showing the results obtained by running the program.
Abstract:
A method using the ring-oven technique for pre-concentration in filter paper discs, combined with near-infrared hyperspectral imaging, is proposed to identify four detergent and dispersant additives and to determine their concentration in gasoline. Different approaches were compared to find the image data processing that best gathers the relevant spectral information. This was attained by selecting the pixels of the region of interest (ROI) with a threshold pre-calculated from histograms of PCA scores; summing the selected spectra to achieve representativeness; and compensating for the superimposed filter-paper spectral contribution, again guided by score histograms for each individual sample. The best classification model used linear discriminant analysis with a genetic algorithm (LDA/GA), whose correct classification rate in the external validation set was 92%. Prior classification of the type of additive present in the gasoline is necessary to select the PLS model required for its quantitative determination. Because two of the additives studied show high spectral similarity, a single PLS regression model was constructed to predict their content in gasoline, while two additional models were used for the remaining additives. External validation of these regression models showed mean percentage prediction errors ranging from 5 to 15%.
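The ROI-selection step described above can be sketched in a few lines: score the hyperspectral pixels with PCA, threshold the score distribution, and sum the retained spectra. The cube shape, the synthetic "sample spot", and the use of the first component with a mean-based threshold are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
# toy hyperspectral cube: 32x32 pixels, 100 spectral channels (assumed sizes)
cube = rng.normal(size=(32, 32, 100))
cube[8:24, 8:24, :] += 2.0                   # "sample spot" with stronger signal

X = cube.reshape(-1, cube.shape[2])          # pixels x wavelengths
Xc = X - X.mean(axis=0)                      # mean-center the spectra
# PCA via SVD; scores = projection of each pixel on the first component
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores_pc1 = Xc @ Vt[0]
if np.corrcoef(scores_pc1, X.sum(axis=1))[0, 1] < 0:
    scores_pc1 = -scores_pc1                 # fix PCA sign ambiguity

# threshold taken from the score distribution (here: its mean, a stand-in
# for the histogram-derived cutoff used in the paper)
roi_mask = scores_pc1 > scores_pc1.mean()

# sum the selected spectra for representativeness, as the abstract describes
roi_spectrum = X[roi_mask].sum(axis=0)
```

On this toy cube the mask recovers exactly the 16x16 spot; on real images the histogram of scores is inspected to place the cutoff between the two modes.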
Abstract:
In this work, we discuss the use of multi-way principal component analysis combined with comprehensive two-dimensional gas chromatography to study the volatile metabolites of the saprophytic fungus Memnoniella sp., sampled in vivo by headspace solid-phase microextraction. This fungus has been identified as able to induce plant resistance against pathogens, possibly through its volatile metabolites. A suitable culture medium was inoculated, and its headspace was sampled with a solid-phase microextraction fiber and chromatographed every 24 h over seven days. Processing the raw chromatograms with multi-way principal component analysis allowed the determination of the inoculation period during which the concentration of volatile metabolites was maximized, as well as the discrimination of the relevant peaks from the complex culture-medium background. Several volatile metabolites not previously described in the literature on biocontrol fungi were observed, along with sesquiterpenes and aliphatic alcohols. These results stress that, given the complexity of multidimensional chromatographic data, multivariate tools may be mandatory even for apparently trivial tasks, such as determining the temporal profile of metabolite production and extinction. Compared with conventional gas chromatography, however, the more complex data processing yields a considerable improvement in the information obtained from the samples.
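The multi-way analysis above can be approximated by the common "unfold then PCA" construction: each daily chromatogram becomes one row of a matrix, the PC1 scores track the production profile over the week, and the loadings point back to the responsible peak. The array sizes, noise level, and synthetic metabolite peak below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
days, d1, d2 = 7, 20, 30              # 7 daily runs, toy 20x30 2-D chromatogram
chroms = rng.normal(scale=0.02, size=(days, d1, d2))
# synthetic metabolite peak at (10, 15), rising then falling over the week
profile = np.array([0.1, 0.5, 1.5, 3.0, 2.0, 0.8, 0.2])
for t in range(days):
    chroms[t, 10, 15] += profile[t]

X = chroms.reshape(days, -1)          # unfold: samples x (d1*d2)
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[0]                   # one PC1 score per day
if np.corrcoef(scores, X.sum(axis=1))[0, 1] < 0:
    scores = -scores                  # fix PCA sign ambiguity

peak_day = int(np.argmax(scores))     # day of maximal metabolite production
loading_map = np.abs(Vt[0]).reshape(d1, d2)  # loadings localize the peak
```

The scores reproduce the temporal profile (maximum on day 3 here), and the loading map singles out the peak position against the background, which is the role MPCA plays in the study above.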
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
OBJECTIVE: To study the trend in mortality related to Chagas disease reported on any line or part of the medical certification section of the death certificate. METHODS: Data came from the multiple-cause-of-death databases of the São Paulo State Data Analysis System Foundation (SEADE) for 1985 to 2006. Causes of death were characterized as underlying, associated (non-underlying), and total mentions. RESULTS: Over the 22-year period, there were 40,002 deaths related to Chagas disease, of which 34,917 (87.29%) had it as the underlying cause and 5,085 (12.71%) as an associated cause. The mortality rate with Chagas disease as the underlying cause declined by 56.07%, while the rate as an associated cause remained stable. The number of deaths was 44.5% higher among men than among women. The fact that 83.5% of the deaths occurred from age 45 onward reveals a cohort effect. The main associated causes when Chagas disease was the underlying cause were direct complications of cardiac involvement, such as conduction disorders, arrhythmias, and heart failure. When Chagas disease was an associated cause, the underlying causes identified were ischemic heart disease, cerebrovascular disease, and neoplasms. CONCLUSIONS: For total mentions, the mortality rate fell by 51.34%, whereas the number of deaths fell by only 5.91%; the decline was smaller among women, with deaths shifting to older ages. The multiple-cause-of-death methodology helped broaden knowledge of the natural history of Chagas disease.
Abstract:
This work is part of a research effort under way since 2000 whose main objective is to measure small dynamic displacements using L1 GPS receivers. A very sensitive way to detect millimetric periodic displacements is the Phase Residual Method (PRM), which is based on frequency-domain analysis of the phase residuals resulting from L1 double-difference static data processing of two satellites at nearly orthogonal elevation angles. In this article, we propose obtaining the phase residuals directly from the raw phase observable collected over a short baseline during a limited time span, instead of obtaining the residual data file from regular GPS processing programs, which do not always allow the choice of the desired satellites. To improve the ability to detect millimetric oscillations, two filtering techniques are introduced: auto-correlation, which reduces phase noise with random time behavior, and a running mean, which separates low-frequency from high-frequency phase sources. Two trials were carried out to verify the proposed method and filtering techniques: one simulates a 2.5-millimeter vertical antenna displacement, and the second uses GPS data collected during a bridge load test. The results showed good consistency in detecting millimetric oscillations.
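Of the two filters named above, the running mean is easy to sketch: smoothing the residual series isolates the slow component, and subtracting it leaves the band where a millimetric oscillation can be sought in the spectrum. The sampling rate, window length, and synthetic 2.5 mm / 0.1 Hz signal below are assumed values, not those of the original experiment.

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 1.0                                    # 1 Hz GPS observations (assumed)
t = np.arange(0, 600, 1 / fs)               # 10 minutes of data
slow_trend = 0.004 * np.sin(2 * np.pi * t / 300)    # low-frequency residual [m]
oscillation = 0.0025 * np.sin(2 * np.pi * 0.1 * t)  # 2.5 mm displacement @ 0.1 Hz
noise = rng.normal(scale=0.001, size=t.size)        # random-behavior phase noise
residuals = slow_trend + oscillation + noise

def running_mean(x, window):
    """Centered moving average via convolution ('same' keeps the length)."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

low = running_mean(residuals, window=51)    # slow multipath-like component
high = residuals - low                      # oscillation + noise remain

# after high-pass filtering, the 0.1 Hz line dominates the spectrum
spec = np.abs(np.fft.rfft(high * np.hanning(high.size)))
freqs = np.fft.rfftfreq(high.size, d=1 / fs)
dominant = freqs[spec[1:].argmax() + 1]     # skip the DC bin
```

With these assumed parameters the dominant spectral line lands on the injected 0.1 Hz oscillation, which mirrors the frequency-domain detection the PRM relies on.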
Abstract:
Three-dimensional spectroscopy techniques are becoming increasingly popular, producing a growing number of large data cubes. Extracting information from these cubes requires the development of new techniques for data processing and analysis. We apply the recently developed technique of principal component analysis (PCA) tomography to a data cube from the center of the elliptical galaxy NGC 7097 and show that this technique is effective in decomposing the data into physically interpretable information. We find that the first five principal components of our data are associated with distinct physical characteristics. In particular, we detect a low-ionization nuclear emission-line region (LINER) with a weak broad component in the Balmer lines. Two images of the LINER are present in our data: one seen through a disk of gas and dust, and the other after scattering by free electrons and/or dust particles in the ionization cone. Furthermore, we extract the spectrum of the LINER, decontaminated from stellar and extended nebular emission, using only PCA tomography. We anticipate that the scattered image is polarized, owing to its scattered nature.
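A minimal sketch of the PCA tomography idea, under assumed cube dimensions and a synthetic emission feature: the cube is recast as a spaxels-by-wavelengths matrix, and each principal component then yields an "eigenspectrum" (the loading) plus a "tomogram" (its scores reshaped back to the sky plane).

```python
import numpy as np

rng = np.random.default_rng(3)
ny, nx, nl = 16, 16, 200                    # assumed spatial and spectral sizes
cube = rng.normal(scale=0.05, size=(ny, nx, nl))
# synthetic nuclear emission line: Gaussian in wavelength, central 4x4 region
line = np.exp(-0.5 * ((np.arange(nl) - 120) / 4.0) ** 2)
cube[6:10, 6:10, :] += line

X = cube.reshape(ny * nx, nl)               # spaxels x wavelengths
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

eigenspectrum1 = Vt[0]                      # spectral signature of PC1
tomogram1 = (Xc @ Vt[0]).reshape(ny, nx)    # where that signature lives on sky
if tomogram1[7, 7] < 0:                     # PCA sign is arbitrary; fix it
    eigenspectrum1, tomogram1 = -eigenspectrum1, -tomogram1
```

Here the first tomogram lights up the nuclear region while its eigenspectrum peaks at the emission line; in the study above, the same decomposition separates the LINER, its scattered image, and the stellar and nebular components.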
Abstract:
Optical monitoring systems are necessary to manufacture multilayer thin-film optical filters with tight tolerances on the spectrum specification. Furthermore, for better accuracy in the measurement of film thickness, direct monitoring is a must: spectrum data are acquired, in real time, from the very optical component undergoing film deposition. The high-vacuum evaporation chamber is the most common equipment for depositing films on the surfaces of optical components. Inside the evaporator, at the top of the chamber, there is a metallic support with several holes in which the optical components are mounted; this support rotates to promote film homogenization. To measure the spectrum of the film being deposited, a light beam must pass through a witness glass undergoing the deposition process, and a sample of the beam is collected by a spectrometer. As both the light beam and the light collector are stationary, a synchronization system is required to identify the moment at which the optical component passes through the light beam.
Abstract:
The advantages offered by the LED (Light Emitting Diode) have led to the quick and extensive adoption of this device as a replacement for incandescent lamps. In this application, however, the relationship between the design variables and the desired result is very complex and difficult to model with conventional techniques. This paper develops a technique using artificial neural networks to obtain the luminous intensity values of brake lights built with SMD (Surface Mounted Device) LEDs from design data. The technique can be used to design any automotive device that uses groups of SMD LEDs. Results from industrial applications using SMD LEDs are presented to validate the proposed technique.
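The abstract does not specify the network, so the following is only a generic sketch of the idea: a small feed-forward network, written in plain NumPy, regressing a made-up "luminous intensity" from assumed design variables (LED count, spacing, drive current). Architecture, features, and data are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
# toy design data: [number of LEDs, spacing, drive current], normalized
Xd = rng.uniform(0, 1, size=(200, 3))
# made-up target: intensity grows with count and current, dips with spacing
y = 3 * Xd[:, 0] - 1.5 * Xd[:, 1] + 2 * Xd[:, 2] + 0.05 * rng.normal(size=200)

# one hidden tanh layer trained by plain full-batch gradient descent
W1 = rng.normal(scale=0.5, size=(3, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=16);      b2 = 0.0
lr = 0.05

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ W2 + b2

_, pred0 = forward(Xd)
loss0 = np.mean((pred0 - y) ** 2)           # error before training
for _ in range(500):
    H, pred = forward(Xd)
    g = 2 * (pred - y) / len(y)             # dLoss/dpred
    gW2 = H.T @ g; gb2 = g.sum()
    gH = np.outer(g, W2) * (1 - H ** 2)     # backprop through tanh
    gW1 = Xd.T @ gH; gb1 = gH.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, pred = forward(Xd)
loss = np.mean((pred - y) ** 2)             # error after training
```

The point of the sketch is only the workflow the paper describes: fit the network on measured design/intensity pairs, then query it for new design candidates.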
Abstract:
In this work, an algorithm to compute the envelope of non-destructive testing (NDT) signals is proposed. The method increases processing speed and reduces memory use in extensive data processing, and it has the advantage of preserving the data information needed for physical modeling of time-dependent measurements. The algorithm is conceived for analyzing data from non-destructive testing. A comparison between different envelope methods and the proposed method, applied to Magnetic Barkhausen Noise (MBN) signals, is presented.
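The abstract does not describe the proposed envelope algorithm itself; as a point of reference, this sketch computes the classic analytic-signal (Hilbert) envelope with plain NumPy FFTs, a baseline against which envelope methods are commonly compared. The amplitude-modulated test signal is an assumption, loosely imitating a burst-like NDT measurement.

```python
import numpy as np

def hilbert_envelope(x):
    """Envelope |x + i*H(x)| via the FFT construction of the analytic signal."""
    n = x.size
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(X * h))

t = np.linspace(0, 1, 2000, endpoint=False)
modulation = np.exp(-0.5 * ((t - 0.5) / 0.1) ** 2)   # slow amplitude envelope
carrier = np.sin(2 * np.pi * 100 * t)                # fast carrier
signal = modulation * carrier

env = hilbert_envelope(signal)
# away from the edges the recovered envelope tracks the true modulation
err = np.max(np.abs(env[200:1800] - modulation[200:1800]))
```

Faster, memory-lighter envelope schemes such as the one proposed above are typically judged by how closely they reproduce this reference envelope on signals like MBN bursts.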
Abstract:
In this work, a system using active RFID tags to supervise truck bulk cargo is described. The tags are attached to the truck bodies, and readers are distributed in the cargo buildings and attached to the weighing stations and discharge platforms. Inspectors carry PDAs with cameras and WiFi support, and access points are installed throughout the discharge area to allow effective confirmation of unloading actions and the acquisition of pictures for future audits. Broadband radio equipment establishes efficient communication links between the weighing stations and the cargo buildings, which are usually located very far from each other in the field. A web application was developed specifically to enable robust communication between the devices, efficient device management, data processing, and report generation for the operating personnel. The system was deployed in a cargo station of a Brazilian seaport, and the results obtained demonstrate its effectiveness.
Abstract:
This paper analyzes the production of apartment buildings for the middle-income segment in the city of São Paulo, Brazil, from a historical perspective. Tracing the response to the occupants' needs, the focus is on family profiles and their demands, the relationship between architectural design and marketing, and satisfaction levels of current users. The paper begins with a brief historical overview of how apartment buildings have evolved over the past eight decades, highlighting the consolidation of the tripartite model. Next, it analyzes family profiles and their current needs, which would call for a redesign of domestic space. From a different angle, it shows how the real-estate market reacts to this situation, namely by introducing minor changes in the domestic space that are closely linked to major investments in marketing. This leads to a discussion on the quality of recent architectural designs in light of Post-Occupancy Evaluation (POE) case studies, which corroborate the tendencies previously outlined. The conclusions drawn from the POEs suggest that the market should establish a closer and deeper relationship between the assessment of the human behavior in the domestic space and the architectural quality of homes as a means of increasing satisfaction levels and improving design performance.