31 results for Dimensional measurement accuracy
at Instituto Politécnico do Porto, Portugal
Abstract:
Shifted Legendre orthonormal polynomials are used for the numerical solution of a new formulation of the multi-dimensional fractional optimal control problem (M-DFOCP) with a quadratic performance index. The fractional derivatives are described in the Caputo sense. The Lagrange multiplier method for the constrained extremum and the operational matrix of fractional integrals are applied, together with the properties of the shifted Legendre orthonormal polynomials. The method reduces the M-DFOCP to a simpler problem consisting of a system of algebraic equations. To confirm the efficiency and accuracy of the proposed scheme, some test problems are implemented and their approximate solutions reported.
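For reference (our addition, not part of the abstract), the Caputo fractional derivative of order \(\alpha\) referred to above is commonly defined as

\[
{}^{C}D^{\alpha} f(t) \;=\; \frac{1}{\Gamma(n-\alpha)} \int_{0}^{t} (t-\tau)^{\,n-\alpha-1}\, f^{(n)}(\tau)\, d\tau, \qquad n-1 < \alpha \le n,
\]

where \(n\) is the smallest integer not less than \(\alpha\) and \(\Gamma\) is the gamma function.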
Abstract:
This paper studies the relationships between chromosomal DNA sequences of twenty species. We propose a methodology combining DNA-based word frequency histograms, correlation methods, and a multidimensional scaling (MDS) technique to visualize structural information underlying chromosomes (CRs) and species. Four statistical measures are tested (Minkowski distance and Cosine, Pearson product-moment, and Kendall τ rank correlations) to analyze the information content of 421 nuclear CRs from twenty species. The proposed methodology is built on mathematical tools and allows the analysis and visualization of very large amounts of stream data, such as DNA sequences, with almost no assumptions other than the predefined DNA “word length.” It produces comprehensible three-dimensional visualizations of CR clustering and related spatial and structural patterns. The results of the four correlation scenarios show that the high-level clusterings produced by the MDS tool are qualitatively similar, with small variations due to the characteristics of each correlation method, and that the clusterings are a consequence of the input data and not artifacts of the methods.
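As an illustration (not the authors' code), the pipeline described above can be sketched as three steps: overlapping DNA word-frequency histograms, a pairwise distance matrix, and classical MDS to three dimensions. The toy sequences and the word length k = 3 are assumptions.

```python
# Sketch of the described pipeline: k-mer histograms -> distance matrix -> MDS.
from collections import Counter
from itertools import product
import numpy as np

def word_histogram(seq, k=3):
    """Normalized frequency histogram of all overlapping length-k DNA words."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    words = ["".join(p) for p in product("ACGT", repeat=k)]
    hist = np.array([counts[w] for w in words], dtype=float)
    return hist / max(hist.sum(), 1.0)

def classical_mds(d, dim=3):
    """Classical (Torgerson) MDS: low-dimensional points from a distance matrix."""
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n
    b = -0.5 * j @ (d ** 2) @ j            # double-centred squared distances
    vals, vecs = np.linalg.eigh(b)
    top = np.argsort(vals)[::-1][:dim]     # largest eigenvalues first
    return vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))

seqs = ["ACGTACGTGGCCAATT" * 4, "ACGTACGTGGCCAATA" * 4, "TTTTGGGGCCCCAAAA" * 4]
hists = np.array([word_histogram(s) for s in seqs])
# Minkowski (here Euclidean, p = 2) distance between every pair of histograms:
dist = np.linalg.norm(hists[:, None, :] - hists[None, :, :], axis=-1)
coords = classical_mds(dist)               # one 3-D point per sequence
```

The same `dist` matrix could instead be filled with Cosine, Pearson, or Kendall τ based dissimilarities, which is the comparison the paper performs.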
Abstract:
Human-Computer Interaction (HCI) concerns the interaction between computers and people, and context awareness (CA) is one of its most important components. In particular, when tasks are sequential or continuous (between users and devices, among users, or among devices), the right context information is needed to decide the next action. To make sound decisions, all context information must be gathered into one structure, which we define in this article as the Context-Aware Matrix (CAM). However, making exact decisions is difficult because of problems such as low accuracy, overhead, and bad context injected by attackers, and many researchers have studied these problems. Moreover, HCI still has weaknesses with respect to safety. In this article, we propose building a CAM that includes selecting the best server in each area. As a result, moving users can be offered the best route.
Abstract:
Growing concern with product quality is becoming evident in the wine industry, driven by greater consumer awareness and demand. The presence of organoleptic defects in wine represents a source of financial loss for this industry, so their control is indispensable for obtaining a high-quality product. It is therefore of interest to develop a fast analytical method allowing the simultaneous quantification of the molecules identified as the main causes of olfactory disturbances in wines. This work aims to implement and validate a method for the determination of contaminants in wine by solid-phase microextraction (SPME) and gas chromatography coupled to tandem mass spectrometry (GC-MS/MS), and to correlate it with sensory analysis. Solid-phase microextraction is simple and fast in that it requires no sample pre-treatment. In turn, GC-MS analysis clearly identifies the compounds under study, namely 4-ethylphenol (4-EP), 4-ethylguaiacol (4-EG), 2,4,6-trichloroanisole (TCA), 2,3,4,6-tetrachloroanisole (TeCA) and 2,4,6-tribromoanisole (TBA). The extraction conditions were optimized by comparing 100 μm PDMS and 50/30 μm DVB/CAR/PDMS fibres. More satisfactory results, in terms of response and signal-to-noise ratio, were obtained with the 50/30 μm DVB/CAR/PDMS fibre, and the extraction conditions were set at an incubation/extraction temperature of 55 °C, a stirring speed of 250 rpm and an extraction time of 60 minutes. Over the course of this work, 50 wine samples were analysed, of which 48 were Douro red wines and 2 were Port wines. To validate the methodology, studies of linearity, analytical thresholds, repeatability, intermediate precision and recovery were carried out.
In general, good linearity was obtained for the chosen concentration ranges. Regarding the limits of detection and quantification, 4-EP is the contaminant with the highest concentration range and shows the highest analytical thresholds, with values close to the last concentration levels, ranging between 65 and 583 μg/L. Among the anisoles, TBA presents the lowest detection limits, between 0.4 and 17.0 ng/L. The analytical thresholds were validated through intermediate precision and repeatability studies, whose results fall within the specifications described in document SANCO/10684/2009 (%RSD ≤ 30% for anisoles and %RSD ≤ 20% for volatile phenols). Accuracy studies were also carried out using recovery assays and interlaboratory trials. Good recoveries were often achieved, although greater difficulties were noted for TBA and TeCA. In the interlaboratory trials, larger discrepancies were found for 4-EP, while the remaining contaminants generally showed satisfactory results (|z-score| ≤ 2).
Abstract:
The tongue is the most important and dynamic articulator in speech formation, because of its anatomical characteristics (particularly the large volume of this muscular organ compared to the surrounding organs of the vocal tract) and the wide range of movements and flexibility involved. In speech communication research, a variety of techniques have been used for measuring three-dimensional vocal tract shapes. More recently, magnetic resonance imaging (MRI) has become common, mainly because this technique allows the collection of sets of static and dynamic images that can represent the entire vocal tract along any orientation. Over the years, different anatomical organs of the vocal tract have been modelled, namely 2D and 3D tongue models, using parametric or statistical modelling procedures. Our aim is to present and describe some 3D models reconstructed from MRI data for one subject uttering sustained articulations of some typical Portuguese sounds. Thus, we present a 3D database of the tongue, obtained by stack combination, with the subject articulating Portuguese vowels. This 3D knowledge of the speech organs could be very important, especially for clinical purposes (for example, the assessment of articulatory impairments following tongue surgery in speech rehabilitation) and for a better understanding of the acoustic theory of speech formation.
Abstract:
A novel biomimetic sensor for the potentiometric transduction of oxytetracycline is presented. The artificial host was imprinted in methacrylic acid and/or acrylamide based polymers. Different amounts of molecularly imprinted and non-imprinted polymers were dispersed in different plasticizing solvents and entrapped in a poly(vinyl chloride) matrix. Only the molecularly imprinted sensors allowed a potentiometric transduction, suggesting the existence of host–guest interactions. These sensors exhibited a near-Nernstian response in steady-state evaluations; slopes and detection limits ranged from 42 to 63 mV/decade and from 2.5 to 31.3 µg/mL, respectively. The sensors were independent of the pH of the test solutions within pH 2–5. Good selectivity was observed towards glycine, ciprofloxacin, creatinine, nalidixic acid, sulfadiazine, cysteine, hydroxylamine and lactose. In flowing media, the biomimetic sensors presented good reproducibility (RSD of ±0.7%), fast response, good sensitivity (65 mV/decade), a wide linear range (5.0×10−5 to 1.0×10−2 mol/L), a low detection limit (19.8 µg/mL) and a stable baseline with a 5×10−3 mol/L citrate buffer (pH 2.5) carrier. The sensors were successfully applied to the analysis of drugs and urine. This work confirms the possibility of using molecularly imprinted polymers as ionophores for organic ion recognition in potentiometric transduction.
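For reference (our addition, not part of the abstract), the "near-Nernstian" slopes reported above are compared against the theoretical electrode response

\[
E \;=\; E^{0} + \frac{2.303\,RT}{zF}\,\log a_i ,
\]

which at 25 °C gives approximately \(59.2/z\) mV per decade of ion activity for an ion of charge \(z\); slopes in the 42–63 mV/decade range are therefore close to the ideal monovalent-ion value.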
Abstract:
Absolute positioning – the real-time satellite-based positioning technique that relies solely on global navigation satellite systems – lacks accuracy for several real-time application domains. To provide increased positioning quality, ground-based or satellite-based augmentation systems can be devised, depending on the extent of the area to cover. The underlying technique – multiple reference station differential positioning – can, in the case of ground systems, be further enhanced through the implementation of the virtual reference station concept. Our approach is a ground-based system made of a small-sized network of three stations in which the virtual reference station concept was implemented. The stations provide code pseudorange corrections, which are combined using a measurement-domain approach with weights inversely proportional to the distance from source station to rover. All data links are established through the Internet.
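As a hypothetical sketch (not the paper's implementation), the measurement-domain combination described above can be expressed as an inverse-distance weighted average of the per-station pseudorange corrections. The correction values and station-rover distances below are illustrative.

```python
# Inverse-distance weighting of code pseudorange corrections from
# several reference stations, as described in the abstract above.
def combine_corrections(corrections, distances):
    """Blend per-station corrections (m) with weights ~ 1/distance."""
    weights = [1.0 / d for d in distances]
    total = sum(weights)
    return sum(w * c for w, c in zip(weights, corrections)) / total

# Three stations: one satellite's corrections (m) and distances to rover (km)
prc = combine_corrections([2.1, 1.8, 2.4], [12.0, 25.0, 40.0])
# The nearest station (12 km, correction 2.1 m) dominates the blend.
```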
Abstract:
Radio interference drastically affects the performance of sensornet communications, leading to packet loss and reduced energy efficiency. As an increasing number of wireless devices operate on the same ISM frequencies, there is a strong need for understanding and debugging the performance of existing sensornet protocols under interference. Doing so requires a low-cost, flexible testbed infrastructure that allows the repeatable generation of a wide range of interference patterns. Unfortunately, to date, existing sensornet testbeds lack such capabilities and do not make it easy to study the coexistence problems between devices sharing the same frequencies. This paper addresses the current lack of such an infrastructure by using off-the-shelf sensor motes to record and play back interference patterns, as well as to generate customizable and repeatable interference in real time. We propose and develop JamLab: a low-cost infrastructure that augments existing sensornet testbeds with accurate interference generation while limiting the overhead to a simple upload of the appropriate software. We explain how we tackle the hardware limitations to obtain accurate measurement and regeneration of interference, and we experimentally evaluate the accuracy of JamLab with respect to time, space and intensity. We further use JamLab to characterize the impact of interference on sensornet MAC protocols.
Abstract:
Networked control systems (NCSs) are spatially distributed systems in which communication between sensors, actuators and controllers occurs through a shared band-limited digital communication network. The use of a shared communication network, in contrast to several dedicated independent connections, introduces new challenges, which are even more acute in large-scale and dense networked control systems. In this paper we investigate a recently introduced technique for gathering information from a dense sensor network for use in networked control applications. Efficiently obtaining an approximate interpolation of the sensed data offers a good trade-off between accuracy in the measurement of the input signals and the delay to actuation, both important aspects for the quality of control. We introduce a variation of the state-of-the-art algorithms which we prove performs better because it takes into account the changes of the input signal over time within the process of obtaining the approximate interpolation.
Abstract:
This paper presents a novel approach to WLAN propagation models for use in indoor localization. The major goal of this work is to eliminate the need for in situ data collection to generate the fingerprinting map; instead, the map is generated using analytical propagation models such as COST Multi-Wall, COST 231 average wall and Motley-Keenan. The location estimation algorithms kNN (K-Nearest Neighbour) and WkNN (Weighted K-Nearest Neighbour) were used to determine the accuracy of the proposed technique. This work uses analytical and measurement tools to determine which path loss propagation models are better suited to location estimation applications based on the Received Signal Strength Indicator (RSSI). The study presents different proposals for choosing the most appropriate values for the model parameters, such as obstacle attenuations and coefficients. Some adjustments to these models, particularly to Motley-Keenan, considering the thickness of walls, are proposed. The best solution found is based on the adjusted Motley-Keenan and COST models, which allow the propagation loss to be estimated for several environments. Results obtained from two testing scenarios showed the reliability of the adjustments, yielding smaller errors between the measured and predicted values.
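As an illustrative sketch (the wall attenuations, reference loss and fingerprint data below are made-up assumptions, not the paper's calibrated values), the two ingredients named above can be combined as follows: a multi-wall path-loss model builds a synthetic fingerprint map, and kNN estimates position from an observed RSSI vector.

```python
# Multi-wall path-loss model + kNN location estimation, in the spirit of
# the fingerprinting approach described in the abstract above.
import math

def multi_wall_loss(d, walls, l0=40.0, n=2.0, wall_att=3.5):
    """Path loss (dB): distance-dependent term plus per-wall attenuation."""
    return l0 + 10.0 * n * math.log10(max(d, 1.0)) + walls * wall_att

def knn_position(fingerprints, observed, k=2):
    """Average the positions of the k fingerprints nearest in RSSI space."""
    ranked = sorted(fingerprints,
                    key=lambda fp: sum((a - b) ** 2
                                       for a, b in zip(fp["rssi"], observed)))
    best = ranked[:k]
    return tuple(sum(fp["pos"][i] for fp in best) / k for i in range(2))

# Synthetic map: each entry holds a position and its model-predicted RSSI
fingerprints = [
    {"pos": (0.0, 0.0), "rssi": [-multi_wall_loss(2.0, walls=0)]},
    {"pos": (5.0, 0.0), "rssi": [-multi_wall_loss(6.0, walls=1)]},
    {"pos": (0.0, 5.0), "rssi": [-multi_wall_loss(6.0, walls=2)]},
]
est = knn_position(fingerprints, observed=[-50.0], k=2)
# est averages the two fingerprints nearest to -50 dBm in RSSI space.
```

WkNN would replace the plain average in `knn_position` with a weighted average, the weights typically being inverse RSSI-space distances.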
Abstract:
The widespread employment of carbon-epoxy laminates in high-responsibility and severely loaded applications raises the issue of their handling after damage. Repair of these structures, rather than their disposal, should be evaluated for cost saving and ecological purposes. From this perspective, the availability of efficient repair methods is essential to restore the strength of the structure. The development and validation of accurate predictive tools for the repairs' behaviour are also extremely important, allowing the reduction of costs and time associated with extensive test programmes. Compared with strap repairs, scarf repairs have the advantages of higher efficiency and the absence of aerodynamic disturbance. This work reports a numerical study of the tensile behaviour of three-dimensional scarf repairs in carbon-epoxy structures, using a ductile adhesive (Araldite® 2015). The finite element analysis was performed in ABAQUS® and Cohesive Zone Modelling was used to simulate damage onset and growth in the adhesive layer. Trapezoidal cohesive laws in each pure mode were used to account for the ductility of the specific adhesive. A parametric study was performed on the repair width and scarf angle. The use of over-laminating plies covering the repaired region at the outer or both repair surfaces was also tested as an attempt to increase the repair efficiency. The obtained results allowed design principles for repairing composite structures to be proposed.
Abstract:
The interlaminar fracture toughness in pure mode II (GIIc) of a Carbon-Fibre Reinforced Plastic (CFRP) composite is characterized experimentally and numerically in this work, using the End-Notched Flexure (ENF) fracture characterization test. The value of GIIc was extracted by a new data reduction scheme that avoids crack length measurement, named the Compliance-Based Beam Method (CBBM). This method eliminates crack measurement errors, which can be non-negligible and affect the accuracy of the fracture energy calculations. Moreover, it accounts for Fracture Process Zone (FPZ) effects. A numerical study using the Finite Element Method (FEM) and a triangular cohesive damage model, implemented within interface finite elements and based on the indirect use of Fracture Mechanics, was performed to evaluate the suitability of the CBBM for obtaining GIIc. This was done by comparing the input values of GIIc in the numerical models with those resulting from the application of the CBBM to the numerical load-displacement (P-δ) curve. In this numerical study, the Compliance Calibration Method (CCM) was also used to extract GIIc for comparison purposes.
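For orientation (our addition; notation here is ours, and the exact expressions should be checked against the CBBM literature), the CBBM for the ENF test evaluates the mode II strain energy release rate from an equivalent crack length rather than a measured one, in a form such as

\[
G_{II} \;=\; \frac{9\,P^{2}\,a_{eq}^{2}}{16\,b^{2}\,E_{f}\,h^{3}},
\]

where \(P\) is the applied load, \(b\) and \(h\) are the specimen width and half-thickness, \(E_{f}\) is an equivalent flexural modulus obtained from the initial compliance, and \(a_{eq}\) is the equivalent crack length computed from the current compliance, thereby avoiding crack length measurement and incorporating FPZ effects.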
Abstract:
Master's degree in Chemical Engineering – Energy Optimization in the Chemical Industry branch
Abstract:
Recent studies of mobile Web trends show a continuous explosion of mobile-friendly content. However, the increasing number and heterogeneity of mobile devices pose several challenges for Web programmers who want to automatically detect the delivery context and adapt the content to mobile devices. In this process, the device detection phase assumes an important role, where an inaccurate detection could result in a poor mobile experience for the end-user. In this paper we compare the most promising approaches to mobile device detection. Based on this study, we present an architecture for a system to detect and deliver uniform m-Learning content to students in a higher education school. We focus mainly on the device capabilities repository, which is manageable and accessible through an API. We detail the structure of the capabilities XML Schema that formalizes the data within the device capabilities XML repository, and the REST Web Service API for selecting the corresponding device capabilities data according to a specific request. Finally, we validate our approach by presenting access and usage statistics for the mobile web interface of the proposed system, such as hits and new visitors, mobile platforms, average time on site and rejection rate.
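As a hypothetical sketch of the capabilities repository lookup described above (the element and attribute names are illustrative assumptions; the real structure is fixed by the system's XML Schema), a client-side selection of one device's capability data might look like:

```python
# Selecting a device capability from a small XML capabilities repository.
import xml.etree.ElementTree as ET

REPO = """<devices>
  <device id="phoneA"><capability name="screen_width">320</capability></device>
  <device id="phoneB"><capability name="screen_width">480</capability></device>
</devices>"""

def capability(repo_xml, device_id, name):
    """Return the named capability value for one device, or None."""
    root = ET.fromstring(repo_xml)
    for dev in root.findall("device"):
        if dev.get("id") == device_id:
            for cap in dev.findall("capability"):
                if cap.get("name") == name:
                    return cap.text
    return None

width = capability(REPO, "phoneB", "screen_width")  # "480"
```

In the described system this lookup would sit behind the REST Web Service API, with the detection phase supplying the device identifier.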
Abstract:
Considering tobacco smoke one of the most health-relevant indoor sources, the aim of this work was to further understand its negative impacts on human health. The specific objectives were to evaluate the levels of particulate-bound PAHs in smoking and non-smoking homes and to assess the risks associated with inhalation exposure to these compounds. The work applied the toxicity equivalency factor approach (including the estimation of lifetime lung cancer risks, WHO) and the methodology established by USEPA (considering three different age categories) to 18 PAHs detected in inhalable (PM10) and fine (PM2.5) particles at two homes. The total concentration of the 18 PAHs (ΣPAH) was 17.1 and 16.6 ng m−3 in PM10 and PM2.5 at the smoking home, and 7.60 and 7.16 ng m−3 in PM10 and PM2.5 at the non-smoking one. Compounds with five and six rings composed the majority of the particulate PAH content (73 and 78 % of ΣPAH at the smoking and non-smoking home, respectively). Target carcinogenic risks exceeded the USEPA health-based guideline at the smoking home for two age categories. Estimated lifetime lung cancer risks largely exceeded (68–200 times) the health-based guideline levels at both homes, demonstrating that long-term exposure to PAHs at these levels would eventually pose a risk of developing cancer. The high cancer risks determined in the absence of smoking were probably caused by the contribution of PAHs from outdoor sources.
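As an illustration of the toxicity equivalency factor (TEF) approach mentioned above (the TEF values, in the style of the commonly used Nisbet-LaGoy scale, and the concentrations are examples, not the paper's data), each PAH concentration is scaled by its TEF to obtain a benzo[a]pyrene-equivalent concentration:

```python
# Benzo[a]pyrene-equivalent concentration via toxicity equivalency factors.
TEF = {"benzo[a]pyrene": 1.0, "benz[a]anthracene": 0.1, "chrysene": 0.01}

def bap_equivalent(conc_ng_m3):
    """Sum of concentration x TEF over measured PAHs (ng BaP-eq per m3)."""
    return sum(c * TEF[name] for name, c in conc_ng_m3.items())

bapeq = bap_equivalent({"benzo[a]pyrene": 1.2,
                        "benz[a]anthracene": 2.0,
                        "chrysene": 3.0})  # = 1.2 + 0.2 + 0.03
```

The BaP-equivalent total is then multiplied by a unit-risk factor (e.g. the WHO lung cancer unit risk) to estimate the lifetime cancer risk compared against the guideline.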