37 results for Processing Element Array
in Instituto Polit
Resumo:
The automotive components and accessories industries are a fundamental link in the production chain of the automotive industry. Within this industrial landscape is Preh Portugal, Lda, a supplier of electronic components, specifically climate control panels. The panels Preh supplies to its customers are subject to rigorous quality and functionality tests. In this context arises the functional key test, which relates key travel to the actuating force; this relationship must comply with a standard characteristic curve for the key type. Beyond these requirements, the key must also close and open its electrical contact. This thesis focuses on the development of the key test, presenting a change to the current system through the introduction of an embedded system, with the aim of making the test system more flexible and reducing costs. The embedded system is intended to provide processing capability to the test and thus replace the current computer as the processing element. The implemented solution consisted of a structural change: the embedded system was placed between the computer and the displacement system. With the core of the test process now residing in the embedded system, it must communicate with the remaining elements involved in the test: RS-232 serial with the displacement system (reading key travel and force), Ethernet with the computer (commands, parameters, and results), and CAN with the climate control panel (closing/opening of the electrical contact). The completion of this project resulted in a new structure and application, easily integrated into the production line, with the advantages of being less costly and more flexible, as intended.
Resumo:
The most common techniques for stress analysis/strength prediction of adhesive joints involve analytical or numerical methods such as the Finite Element Method (FEM). However, the Boundary Element Method (BEM) is an alternative numerical technique that has been successfully applied to the solution of a wide variety of engineering problems. This work evaluates the applicability of the boundary element code BEASY as a design tool to analyze adhesive joints. The linearity of peak shear and peel stresses with the applied displacement is studied and compared between BEASY and the analytical model of Frostig et al., considering a bonded single-lap joint under tensile loading. The BEM results are also compared with FEM in terms of stress distributions. To evaluate the mesh convergence of BEASY, the influence of mesh refinement on peak shear and peel stress distributions is assessed. Joint stress predictions are carried out numerically in BEASY and ABAQUS®, and analytically by the models of Volkersen, Goland and Reissner, and Frostig et al. The failure loads for each model are compared with experimental results. The preparation, processing, and mesh creation times are compared for all models. The BEASY results showed good agreement with the conventional methods.
Resumo:
Over time, the XML markup language has acquired considerable importance in application development, standards definition, and the representation of large volumes of data, such as databases. Today, processing XML documents in a short period of time is a critical activity in a wide range of applications, which requires choosing the most appropriate mechanism to parse XML documents quickly and efficiently. When using a programming language such as Java for XML processing, it becomes necessary to use effective mechanisms, e.g. APIs, that allow reading and processing large documents in an appropriate manner. This paper presents a performance study of the main existing Java APIs that deal with XML documents, in order to identify the most suitable one for processing large XML files.
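The trade-off the study measures, tree-based (DOM) versus streaming (SAX/StAX-style) parsing, can be sketched in any language. The Python snippet below is an illustrative stand-in for the Java APIs compared in the paper; the document and element names are invented. The DOM-style parse holds the whole tree in memory, while the streaming pass discards each element as soon as it is processed, which is what makes streaming APIs suitable for large files.

```python
import io
import xml.etree.ElementTree as ET

# Build a small in-memory XML document standing in for a large file.
doc = "<items>" + "".join(f"<item id='{i}'>v{i}</item>" for i in range(1000)) + "</items>"

# DOM-style: parse the whole tree into memory, then traverse it.
root = ET.fromstring(doc)
dom_count = len(root.findall("item"))

# Streaming (SAX/StAX-like): visit elements as they are parsed and
# discard them immediately, keeping memory usage roughly constant.
stream_count = 0
for event, elem in ET.iterparse(io.StringIO(doc), events=("end",)):
    if elem.tag == "item":
        stream_count += 1
        elem.clear()  # free the processed subtree
```

Both traversals see the same 1000 elements; only the peak memory footprint differs.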
Resumo:
An Electrocardiogram (ECG) monitoring system deals with several challenges related to noise sources. The main goal of this work was the study of adaptive signal processing algorithms for ECG noise reduction applied to real signals. This document presents an adaptive filtering technique based on the Least Mean Square (LMS) algorithm to remove the artefacts caused by electromyography (EMG) and power-line noise in the ECG signal. For these experiments, real noise signals were used, mainly to observe the difference between real and simulated noise sources. Very good results were obtained, owing to the noise-removal capability of this technique.
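The LMS noise canceller described above can be sketched in a few lines. This is a self-contained Python illustration with synthetic stand-ins for the real ECG and mains-noise recordings used in the work; all signal parameters (sampling rate, frequencies, amplitudes, filter length, step size) are invented for illustration.

```python
import math

# Synthetic setup: a slow "ECG-like" signal corrupted by 50 Hz mains noise.
fs = 500.0   # sampling rate (Hz), illustrative
n = 2000
clean = [math.sin(2 * math.pi * 1.2 * t / fs) for t in range(n)]        # stand-in for ECG
noise = [0.8 * math.sin(2 * math.pi * 50.0 * t / fs + 0.3) for t in range(n)]
primary = [c + v for c, v in zip(clean, noise)]                          # electrode signal
ref = [math.sin(2 * math.pi * 50.0 * t / fs) for t in range(n)]          # reference noise input

# LMS adaptive noise canceller: the weights w are adapted so that the
# filtered reference approximates the noise in the primary input; the
# error e(t) is then the cleaned signal.
taps, mu = 8, 0.01
w = [0.0] * taps
out = []
for t in range(taps, n):
    x = ref[t - taps + 1:t + 1][::-1]                    # most recent reference samples
    y = sum(wi * xi for wi, xi in zip(w, x))             # noise estimate
    e = primary[t] - y                                   # cleaned sample
    w = [wi + 2 * mu * e * xi for wi, xi in zip(w, x)]   # LMS weight update
    out.append(e)

# After convergence the residual should track the clean signal.
mse = sum((a - b) ** 2 for a, b in zip(out[-500:], clean[-500:])) / 500
mse_noisy = sum((p - c) ** 2 for p, c in zip(primary[-500:], clean[-500:])) / 500
```

With these settings the residual error against the clean signal ends up well below the power of the injected mains noise, which is the behaviour the abstract reports for real recordings.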
Resumo:
A multiresidue approach using microwave-assisted extraction and liquid chromatography with photodiode array detection was investigated for the determination of butylate, carbaryl, carbofuran, chlorpropham, ethiofencarb, linuron, metobromuron, and monolinuron in soils. The critical parameters of the developed methodology were studied. Method validation was performed by analysing freshly spiked and aged spiked soil samples. The recoveries and relative standard deviations reached using the optimized conditions were between 77.0 ± 0.46% and 120 ± 2.9%, except for ethiofencarb (46.4 ± 4.4% to 105 ± 1.6%) and butylate (22.1 ± 7.6% to 49.2 ± 11%). Soil samples from five locations in Portugal were analysed.
Resumo:
The Quinone outside Inhibitors (QoI) are one of the most important and recent fungicide groups used in viticulture and are also allowed by Integrated Pest Management. Azoxystrobin, kresoxim-methyl, and trifloxystrobin are the main active ingredients for treating the downy and powdery mildews that can be present in grapes and wines. In this paper, a method is reported for the analysis of these three QoI fungicides in grapes and wine. After liquid–liquid extraction and a clean-up on commercial silica cartridges, analysis was performed by isocratic HPLC with diode array detection (DAD) with a run time of 13 min. Confirmation was performed by solid-phase micro-extraction (SPME), followed by GC/MS determination. The main validation parameters for the three compounds in grapes and wine were a limit of detection of up to 0.073 mg kg-1, a precision not exceeding 10.0%, and an average recovery of 93 ± 38%.
Resumo:
Background: Temporal lobe epilepsy (TLE) is a neurological disorder that directly affects cortical areas responsible for auditory processing. The resulting abnormalities can be assessed using event-related potentials (ERP), which have high temporal resolution. However, little is known about TLE in terms of dysfunction of early sensory memory encoding or possible correlations between EEGs, linguistic deficits, and seizures. Mismatch negativity (MMN) is an ERP component – elicited by introducing a deviant stimulus while the subject is attending to a repetitive behavioural task – which reflects pre-attentive sensory memory function, neuronal auditory discrimination, and perceptual accuracy. Hypothesis: We propose an MMN protocol for future clinical application and research based on the hypothesis that children with TLE may have abnormal MMN for speech and non-speech stimuli. The MMN can be elicited with a passive auditory oddball paradigm, and the abnormalities might be associated with the location and frequency of epileptic seizures. Significance: The suggested protocol might contribute to a better understanding of the neuropsychophysiological basis of MMN. We suggest that in TLE central sound representation may be decreased for speech and non-speech stimuli. Discussion: MMN arises as a difference in response to speech and non-speech stimuli across electrode sites. TLE in childhood might be a good model for studying topographic and functional auditory processing and its neurodevelopment, pointing to MMN as a possible clinical tool for prognosis, evaluation, follow-up, and rehabilitation in TLE.
Resumo:
TLE in infancy has been the subject of varied research. Topographical and structural evidence coincides with the neuronal systems responsible for auditory processing of the highest specialization and complexity. Recent studies have shown the need for hemispheric asymmetry to optimize central auditory processing (CAP) and the acquisition and learning of a language system. A new functional research paradigm is required to study mental processes through methods of cognitive-sensory information analysis processed in very short periods of time (msec), such as event-related potentials (ERPs). Thus, in this article, we hypothesize that TLE in infancy could be a good model for the topographic and functional study of CAP and its development, contributing to a better understanding of the learning difficulties of children with this neurological disorder.
Resumo:
Alheiras are traditional, smoked, fermented meat sausages produced in Portugal, with an undeniable cultural and gastronomic legacy. In this study, we assessed the nutritional value of this product, as well as the influence of different types of thermal processing. Alheiras from Mirandela were submitted to six different procedures: microwave, skillet, oven, charcoal grill, electric fryer, and electric grill. Protein, fat, carbohydrate, mineral, NaCl, and cholesterol contents, as well as the fatty acid profile, were evaluated. The results show that alheiras are not hypercaloric but are an unbalanced foodstuff (high levels of proteins and lipids), and that the type of processing has a major impact on their nutritional value. Charcoal grilling is the healthiest option: less fat (12.5 g/100 g) and cholesterol (29.3 mg/100 g), corresponding to a lower caloric intake (231.8 kcal, 13% less than the raw product). Conversely, fried alheiras presented the worst nutritional profile, with the highest levels of fat (18.1 g/100 g) and cholesterol (76.0 mg/100 g).
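Caloric figures like those quoted above are conventionally derived from macronutrient contents via the Atwater factors (4, 9, and 4 kcal/g for protein, fat, and carbohydrate); whether the study used exactly these factors is an assumption. A minimal sketch:

```python
def atwater_kcal(protein_g, fat_g, carb_g):
    """Estimate energy content (kcal) from macronutrients per serving,
    using the standard Atwater general factors: 4 kcal/g for protein,
    9 kcal/g for fat, 4 kcal/g for carbohydrate."""
    return 4.0 * protein_g + 9.0 * fat_g + 4.0 * carb_g

# Illustrative only: 10 g of each macronutrient.
example = atwater_kcal(10.0, 10.0, 10.0)
```

This makes the fat term's dominance explicit: each gram of fat contributes more than twice the energy of a gram of protein or carbohydrate, which is why the lower-fat charcoal-grilled samples also show the lowest caloric value.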
Resumo:
Component joining is typically performed by welding, fastening, or adhesive bonding. For bonded aerospace applications, adhesives must withstand high temperatures (200°C or above, depending on the application), which implies their mechanical characterization under identical conditions. The extended finite element method (XFEM) is an enhancement of the finite element method (FEM) that can be used for the strength prediction of bonded structures. This work proposes and validates damage laws for a thin layer of an epoxy adhesive at room temperature (RT), 100, 150, and 200°C using the XFEM. The fracture toughness (GIc) and the maximum load in pure tensile loading were defined by testing double-cantilever beam (DCB) and bulk tensile specimens, respectively, which permitted building the damage laws for each temperature. The bulk test results revealed that the maximum load decreased gradually with temperature. On the other hand, the value of GIc of the adhesive, extracted from the DCB data, was shown to be relatively insensitive to temperature up to the glass transition temperature (Tg), while above Tg (at 200°C) a great reduction took place. The output of the DCB numerical simulations for the various temperatures showed good agreement with the experimental results, which validated the obtained data for strength prediction of bonded joints in tension. By these results, the XFEM proved to be an alternative for the accurate strength prediction of bonded structures.
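A common way to build damage laws from a peak traction and a fracture toughness, as measured above, is the triangular (bilinear) traction-separation law; whether this exact shape was used in the work is an assumption. The sketch below shows the construction: the toughness fixes the area under the curve, so the failure separation follows from the peak traction.

```python
def triangular_traction(delta, sigma_max, g_ic, k):
    """Traction for a triangular (bilinear) cohesive damage law.

    delta     current separation of the adhesive layer
    sigma_max peak traction (damage onset)
    g_ic      fracture toughness = area under the traction-separation curve
    k         initial (undamaged) stiffness
    """
    delta0 = sigma_max / k            # separation at damage onset
    deltaf = 2.0 * g_ic / sigma_max   # separation at complete failure (area = g_ic)
    if delta <= delta0:
        return k * delta              # linear-elastic branch
    if delta >= deltaf:
        return 0.0                    # fully failed, no load transfer
    # linear softening between onset and failure
    return sigma_max * (deltaf - delta) / (deltaf - delta0)
```

For each test temperature, sigma_max and g_ic would be replaced by the measured values, shifting both the peak and the failure separation of the law.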
Resumo:
The main purpose of this work is to present and interpret the changes in structure and physical properties of tantalum oxynitride (TaNxOy) thin films, produced by dc reactive magnetron sputtering, as the processing parameters are varied. A set of TaNxOy films was prepared by varying the reactive gas flow rate, using a N2/O2 gas mixture with a concentration ratio of 17:3. The films obtained by this process exhibited significant differences. The measured composition and the interpretation of X-ray diffraction results show that, depending on the partial pressure of the reactive gases, the films are: essentially dark grey metallic, when the atomic ratio (N + O)/Ta < 0.1, evidencing a tetragonal β-Ta structure; grey-brownish, when 0.1 < (N + O)/Ta < 1, exhibiting a face-centred cubic (fcc) TaN-like structure; and transparent oxide-type, when (N + O)/Ta > 1, evidencing the existence of Ta2O5, but with an amorphous structure. These transparent films exhibit refractive indexes in the visible region always higher than 2.0. The wear resistance of the films is relatively good; the best behaviour was obtained for the films with (N + O)/Ta ≈ 0.5 and (N + O)/Ta ≈ 1.3.
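The composition-to-structure mapping reported above amounts to a three-regime classification by the (N + O)/Ta atomic ratio. A small illustrative function (the function name and return strings are invented; the thresholds are the ones stated in the abstract):

```python
def classify_tanxoy(ratio):
    """Classify a TaNxOy film by its (N + O)/Ta atomic ratio,
    following the three regimes reported in the abstract."""
    if ratio < 0.1:
        return "dark grey metallic, tetragonal beta-Ta"
    if ratio < 1.0:
        return "grey-brownish, fcc TaN-like"
    return "transparent, amorphous Ta2O5-type oxide"

# The two compositions with the best wear behaviour fall in
# different regimes:
best_wear = [classify_tanxoy(0.5), classify_tanxoy(1.3)]
```

The boundary cases (ratio exactly 0.1 or 1) are not resolved by the abstract; the comparisons above assign them arbitrarily to the higher regime.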
Resumo:
Prostate cancer (PCa) is one of the most incident malignancies worldwide. Although efficient therapy is available for early-stage PCa, treatment of advanced disease is mainly ineffective and remains a clinical challenge. microRNA (miRNA) dysregulation is associated with PCa development and progression. In fact, several studies have reported a widespread downregulation of miRNAs in PCa, which highlights the importance of studying compounds capable of restoring the global miRNA expression. The main aim of this study was to define the usefulness of enoxacin as an anti-tumoral agent in PCa, due to its ability to induce miRNA biogenesis in a TRBP-mediated manner. Using a panel of five PCa cell lines, we observed that all of them were wild type for the TARBP2 gene and expressed TRBP protein. Furthermore, primary prostate carcinomas displayed normal levels of TRBP protein. Remarkably, enoxacin was able to decrease cell viability, induce apoptosis, cause cell cycle arrest, and inhibit the invasiveness of cell lines. Enoxacin was also effective in restoring the global expression of miRNAs. This study is the first to show that PCa cells are highly responsive to the anti-tumoral effects of enoxacin. Therefore, enoxacin constitutes a promising therapeutic agent for PCa.
Resumo:
Networked control systems (NCSs) are spatially distributed systems in which communication between sensors, actuators, and controllers occurs through a shared band-limited digital communication network. The use of a shared network, in contrast to several dedicated independent connections, introduces new challenges, which are even more acute in large-scale and dense networked control systems. In this paper we investigate a recently introduced technique for gathering information from a dense sensor network for use in networked control applications. Efficiently obtaining an approximate interpolation of the sensed data offers a good trade-off between accuracy in the measurement of the input signals and the delay to actuation, both important aspects for the quality of control. We introduce a variation on the state-of-the-art algorithms which we prove performs better because it takes into account the changes of the input signal over time within the process of obtaining an approximate interpolation.
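The accuracy-versus-data trade-off behind approximate interpolation can be illustrated with a simple greedy piecewise-linear approximation that keeps only the samples needed to reconstruct all the others within a tolerance. This is an illustrative sketch of the general idea, not the algorithm from the paper; the signal and tolerance are invented.

```python
import math

# Dense "sensor field" samples: positions and sensed values.
xs = [i / 100.0 for i in range(101)]
ys = [math.sin(2 * math.pi * x) for x in xs]

def approx_interpolation(xs, ys, eps):
    """Greedily keep a subset of sample indices such that linear
    interpolation between kept samples reproduces every dropped
    sample within eps."""
    keep = [0]
    i = 0
    while i < len(xs) - 1:
        best = i + 1
        j = i + 1
        while j < len(xs):
            # Check every dropped sample between i and j against the chord i-j.
            ok = True
            for k in range(i + 1, j):
                t = (xs[k] - xs[i]) / (xs[j] - xs[i])
                interp = ys[i] + t * (ys[j] - ys[i])
                if abs(interp - ys[k]) > eps:
                    ok = False
                    break
            if not ok:
                break
            best = j
            j += 1
        keep.append(best)
        i = best
    return keep

kept = approx_interpolation(xs, ys, eps=0.02)
```

Only a fraction of the dense samples need to cross the network, at a reconstruction error bounded by eps; shrinking eps trades more communication (and hence actuation delay) for measurement accuracy.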
Resumo:
Cooperating objects (COs) is a recently coined term used to signify the convergence of classical embedded computer systems, wireless sensor networks, and robotics and control. We present the essential elements of a reference architecture for scalable data processing under the CO paradigm.