956 results for Mathematical Techniques--Error Analysis


Relevance: 40.00%

Abstract:

The wave energy industry is entering a new phase of pre-commercial and commercial deployments of full-scale devices, so a better understanding of seaway variability is critical to the successful operation of devices. The response of Wave Energy Converters (WECs) to incident waves governs their operational performance, and for many devices this response is highly dependent on spectral shape owing to their resonant properties. Various methods of wave measurement are presented, along with analysis techniques and empirical models. Resource assessments, device performance predictions and monitoring of operational devices will often be based on summary statistics and assume a standard spectral shape such as Pierson-Moskowitz or JONSWAP. Furthermore, these are typically derived from the closest available wave data, frequently separated from the site on scales of the order of 1 km. Therefore, variability of seaways from standard spectral shapes and spatial inconsistency between the measurement point and the device site will cause inaccuracies in the performance assessment. This thesis categorises time- and frequency-domain analysis techniques that can be used to identify changes in a sea state from record to record. Device-specific issues such as dimensional scaling of sea states and power output are discussed, along with potential differences that arise between the estimated and actual output power of a WEC due to spectral shape variation. This is investigated using measured data from various phases of device development.
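Since the abstract centres on the assumption of a standard spectral shape parameterized by summary statistics, a minimal sketch of that step may help: the JONSWAP variance density built from significant wave height Hs and peak period Tp (setting the peak-enhancement factor gamma = 1 recovers the Pierson-Moskowitz shape). The function name and the rescaling so the zeroth moment matches m0 = Hs^2/16 are illustrative choices, not the thesis's code.

```python
import numpy as np

def jonswap_spectrum(f, hs, tp, gamma=3.3):
    """Illustrative JONSWAP variance density S(f) [m^2/Hz], rescaled so
    its zeroth moment matches m0 = Hs^2 / 16; gamma=1 recovers the
    Pierson-Moskowitz shape."""
    fp = 1.0 / tp
    sigma = np.where(f <= fp, 0.07, 0.09)
    peak = gamma ** np.exp(-((f - fp) ** 2) / (2 * sigma**2 * fp**2))
    shape = f**-5.0 * np.exp(-1.25 * (fp / f) ** 4) * peak
    df = f[1] - f[0]                         # uniform grid assumed
    return shape * (hs**2 / 16.0) / (np.sum(shape) * df)

f = np.linspace(0.03, 0.5, 400)              # frequencies in Hz
S = jonswap_spectrum(f, hs=2.5, tp=9.0)
hs_check = 4 * np.sqrt(np.sum(S) * (f[1] - f[0]))   # recovers ~2.5 m
```

A measured spectrum that departs from this shape, even with identical Hs and Tp, is exactly the variability the thesis argues can bias power predictions.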

Relevance: 40.00%

Abstract:

Min/max autocorrelation factor analysis (MAFA) and dynamic factor analysis (DFA) are complementary techniques for analysing short (> 15-25 y), non-stationary, multivariate data sets. We illustrate the two techniques using catch-rate (cpue, catch per unit effort) time-series (1982-2001) for 17 species caught during trawl surveys off Mauritania, with the North Atlantic Oscillation (NAO) index, an upwelling index (UPW), sea surface temperature (SST), and an index of fishing effort as explanatory variables. Both techniques gave coherent results, the most important common trend being a decrease in cpue during the latter half of the time-series, and the next most important being an increase during the first half. A DFA model with SST and UPW as explanatory variables and two common trends gave good fits to most of the cpue time-series. (c) 2004 International Council for the Exploration of the Sea. Published by Elsevier Ltd. All rights reserved.
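For readers unfamiliar with DFA, the model class the abstract describes (a few common trends plus explanatory variables) can be fit with statsmodels. The sketch below uses a tiny synthetic stand-in, reduced from the paper's 17 series so the toy model stays estimable; all names and sizes are illustrative, and `res.factors.filtered` is assumed to hold the filtered trend estimates.

```python
import numpy as np
import statsmodels.api as sm

# Tiny synthetic stand-in: a few standardized cpue series over 20 years
# (the paper uses 17 series, 1982-2001) plus two explanatory variables
# standing in for SST and UPW. Purely to show the model class and API.
rng = np.random.default_rng(0)
trend = np.cumsum(rng.standard_normal(20))
cpue = trend[:, None] * rng.uniform(0.5, 1.5, 6) + rng.standard_normal((20, 6))
cpue = (cpue - cpue.mean(axis=0)) / cpue.std(axis=0)
exog = rng.standard_normal((20, 2))          # columns: SST, UPW

# DFA as described in the abstract: common trends + explanatory variables.
model = sm.tsa.DynamicFactor(cpue, k_factors=2, factor_order=1, exog=exog)
res = model.fit(maxiter=500, disp=False)
common_trends = res.factors.filtered         # estimated common trends
```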

Relevance: 40.00%

Abstract:

This paper focuses on the study of the underdrawings of 16th-century easel paintings attributed to the workshop of the Portuguese-Flemish Master Frei Carlos. This investigation encompasses multidisciplinary research that relates the results of surface exams (infrared reflectography, standard light photography and infrared photography) with analytical investigations. The surface analysis of Frei Carlos' underdrawings by infrared reflectography has shown heterogeneous work, revealing two different situations: (1) an abundant and expressive underdrawing, revealing a Flemish influence, and (2) a simple and outlined underdrawing. This preliminary research raised an important question related to this Portuguese-Flemish workshop and to the analytical approach: is the underdrawings' heterogeneity, as observed in the reflectograms, related to different artists, or is it rather an effect produced by the use of different materials in the underdrawings' execution? Consequently, if different materials were used, how can we gain access to the hidden underdrawings? In order to understand the reasons for this dissemblance, chemical analyses of micro-samples collected in underdrawing areas representing both situations were carried out by optical microscopy, micro Fourier transform infrared spectroscopy (μ-FTIR), scanning electron microscopy coupled with energy-dispersive X-ray spectrometry (SEM-EDX) and micro-Raman spectroscopy (μ-Raman). Taking into account the different possibilities and the practical and theoretical limitations of surface and punctual examinations in the study of easel-painting underdrawings, the research methodology was adjusted, sometimes resulting in a re-analysis of experimental results. This research shows the importance of combining multispectral surface exams and chemical analysis in understanding the artistic creative processes of 16th-century easel paintings.

Relevance: 40.00%

Abstract:

Sandy coasts represent vital areas whose preservation and maintenance involve economic and tourist interests as well. Moreover, these dynamic environments undergo erosion to different degrees depending on their specific characteristics. For this reason, defence interventions are commonly realized by combining engineering solutions with management policies that evaluate their effects over time. Monitoring is the fundamental instrument for obtaining deep knowledge of the phenomenon under investigation. Thanks to technological development, several possibilities are available, both in terms of geomatic surveying techniques and processing tools, allowing high performance and accuracy to be reached. Nevertheless, when the littoral is defined to include both the emerged and the submerged beach, several issues have to be considered. Therefore, the geomatic surveys and all the following steps need to be calibrated according to the individual application, with the reference system, accuracy and spatial resolution as primary aspects. This study evaluates the available geomatic techniques, processing approaches and derived products, aiming to optimise the entire workflow of coastal monitoring by adopting an accuracy-efficiency trade-off. The analyses presented highlight the balance point at which an increase in performance becomes an additional value for the obtained products while ensuring proper data management. This perspective can be a helpful instrument for planning monitoring activities according to the specific purposes of the analysis. Finally, the primary uses of the acquired and processed data in monitoring contexts are presented, also considering possible applications of numerical modelling as a supporting tool. Moreover, the theme of coastal monitoring has been addressed throughout this thesis from a practical point of view, linked to the activities performed by Arpae (Regional agency for prevention, environment and energy of Emilia-Romagna). Indeed, the Adriatic coast of Emilia-Romagna, where sandy beaches particularly exposed to erosion are present, has been chosen as the case study for all the analyses and considerations.

Relevance: 40.00%

Abstract:

Today’s data are increasingly complex, and classical statistical techniques need increasingly refined mathematical tools to model and investigate them. Paradigmatic situations are represented by data which need to be considered up to some kind of transformation, and by all those circumstances in which the analyst needs to define a general concept of shape. Topological Data Analysis (TDA) is a field which is fundamentally contributing to such challenges by extracting topological information from data with a plethora of interpretable and computationally accessible pipelines. We contribute to this field by developing a series of novel tools, techniques and applications for working with a particular topological summary called the merge tree. To analyze sets of merge trees, we introduce a novel metric structure along with an algorithm to compute it, define a framework to compare different functions defined on merge trees, and investigate the metric space obtained with the aforementioned metric. Different geometric and topological properties of the space of merge trees are established, with the aim of obtaining a deeper understanding of such trees. To showcase the effectiveness of the proposed metric, we develop an application in the field of Functional Data Analysis, working with functions up to homeomorphic reparametrization, and in the field of radiomics, where each patient is represented via a clustering dendrogram.
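To make the central object concrete, here is a minimal sketch of how the merge events of a sampled 1D scalar function can be extracted with union-find over the sublevel-set filtration. This only illustrates the summary itself, not the thesis's metric or algorithms, and the function and variable names are made up.

```python
def merge_tree_1d(values):
    """Merge events of the sublevel-set filtration of a sampled 1D
    function: sweep samples by increasing value and use union-find to
    record the heights at which components born at local minima merge.
    Returns (merge_height, min_of_component_A, min_of_component_B)."""
    parent = list(range(len(values)))
    comp_min = {i: values[i] for i in range(len(values))}

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    merges, active = [], set()
    for i in sorted(range(len(values)), key=values.__getitem__):
        active.add(i)
        for j in (i - 1, i + 1):            # 1D neighbours
            if j in active:
                ri, rj = find(i), find(j)
                if ri == rj:
                    continue
                # A genuine saddle: both components were born strictly
                # below the current height.
                if comp_min[ri] < values[i] and comp_min[rj] < values[i]:
                    merges.append((values[i], comp_min[ri], comp_min[rj]))
                parent[rj] = ri
                comp_min[ri] = min(comp_min[ri], comp_min[rj])
    return merges

# Three local minima (0.0, 1.0, 0.5): the two interior maxima, at
# heights 2.0 and 3.0, merge their components.
print(merge_tree_1d([0.0, 2.0, 1.0, 3.0, 0.5]))
```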

Relevance: 40.00%

Abstract:

The deployment of ultra-dense networks is one of the most promising solutions for managing the phenomenon of co-channel interference that affects the latest wireless communication systems, especially in hotspots. To meet the requirements of the use cases and the immense amount of traffic generated in these scenarios, 5G ultra-dense networks are being deployed using various technologies, such as the distributed antenna system (DAS) and the cloud radio access network (C-RAN). Through these centralized densification schemes, virtualized baseband processing units coordinate the distributed access points and manage the available network resources. In particular, link adaptation techniques are shown to be fundamental to overall system operation and performance enhancement. The core of this dissertation is the result of an analysis and comparison of dynamic and adaptive methods for modulation and coding scheme (MCS) selection applied to the latest mobile telecommunications standards. A novel algorithm based on proportional-integral-derivative (PID) controller principles and a block error rate (BLER) target is proposed. Tests were conducted in a 4G and 5G system-level laboratory and, by means of a channel emulator, performance was evaluated for different channel models and target BLERs. Furthermore, due to the intrinsic sectorization of the end-user distribution in the investigated scenario, a preliminary analysis of the joint application of user-grouping algorithms with multi-antenna and multi-user techniques was performed. In conclusion, the importance and impact of other fundamental physical-layer operations, such as channel estimation and power control, on the overall end-to-end system behavior and performance are highlighted.
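The abstract does not spell out the algorithm, so the following is only a plausible sketch of PID-driven outer-loop link adaptation toward a BLER target: each ACK/NACK updates an SNR offset that shifts the MCS selection. The class name, gains and thresholds are all assumptions, not the dissertation's design.

```python
import numpy as np

class PidOlla:
    """Sketch of PID-style outer-loop link adaptation: the gap between
    observed block errors and the BLER target drives an SNR offset that
    is applied when selecting the MCS (gains are illustrative)."""

    def __init__(self, bler_target=0.1, kp=0.5, ki=0.05, kd=0.1):
        self.target = bler_target
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0
        self.offset_db = 0.0

    def update(self, ack):
        """Feed one HARQ outcome: ack=True on success, False on error."""
        err = (0.0 if ack else 1.0) - self.target
        self.integral += err
        deriv = err - self.prev_err
        self.prev_err = err
        # More errors than the target -> negative offset -> safer MCS.
        self.offset_db -= self.kp * err + self.ki * self.integral + self.kd * deriv
        return self.offset_db

def select_mcs(snr_db, offset_db, thresholds_db):
    """Highest MCS index whose (illustrative) SNR threshold is met."""
    idx = int(np.searchsorted(thresholds_db, snr_db + offset_db)) - 1
    return max(idx, 0)

olla = PidOlla(bler_target=0.1)
thresholds = np.arange(-6.0, 22.0, 2.0)   # one threshold per MCS, in dB
mcs = select_mcs(12.0, olla.update(ack=False), thresholds)
```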

Relevance: 40.00%

Abstract:

Artificial Intelligence (AI) and Machine Learning (ML) are novel data analysis techniques providing very accurate prediction results. They are widely adopted in a variety of industries to improve efficiency and decision-making, but they are also being used to develop intelligent systems. Their success rests upon complex mathematical models whose decisions and rationale are usually difficult for human users to comprehend, to the point of being dubbed black boxes. This is particularly relevant in sensitive and highly regulated domains. To mitigate and possibly solve this issue, the Explainable AI (XAI) field has become prominent in recent years. XAI consists of models and techniques that enable understanding of the intricate patterns discovered by black-box models. In this thesis, we consider model-agnostic XAI techniques applicable to tabular data, with a particular focus on the credit scoring domain. Special attention is dedicated to the LIME framework, for which we propose several modifications to the vanilla algorithm, in particular: a pair of complementary Stability Indices that accurately measure LIME stability, and the OptiLIME policy, which helps the practitioner find the proper balance between the stability and reliability of explanations. We subsequently put forward GLEAMS, a model-agnostic surrogate interpretable model which needs to be trained only once while providing both local and global explanations of the black-box model. GLEAMS produces feature attributions and what-if scenarios from both the dataset and the model perspective. Finally, we argue that synthetic data are an emerging trend in AI, being used more and more in place of original data to train complex models. To be able to explain the outcomes of such models, we must guarantee that synthetic data are reliable enough for their explanations to translate to real-world individuals. To this end, we propose DAISYnt, a suite of tests to measure the quality and privacy of synthetic tabular data.
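The instability that motivates the proposed Stability Indices is easy to observe with the `lime` package (pip install lime): because LIME samples randomly around the instance, repeated explanations of the same point can disagree. The sketch below is a crude top-k overlap probe on synthetic data standing in for a credit-scoring table; the thesis's indices are more refined, and all names here are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from lime.lime_tabular import LimeTabularExplainer

# Synthetic stand-in for a credit-scoring table.
X, y = make_classification(n_samples=500, n_features=8, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X, y)

explainer = LimeTabularExplainer(
    X, feature_names=[f"f{i}" for i in range(8)], mode="classification")

def top_conditions(row, k=3):
    """as_list() returns (condition string, weight) pairs; keep the
    top-k condition strings of one LIME run."""
    exp = explainer.explain_instance(row, model.predict_proba, num_features=k)
    return {cond for cond, _ in exp.as_list()}

# Re-explain the same instance and measure how much the top-3 sets agree.
runs = [top_conditions(X[0]) for _ in range(10)]
overlap = np.mean([len(runs[0] & r) / 3 for r in runs[1:]])
print(f"mean top-3 overlap across reruns: {overlap:.2f}")
```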

Relevance: 40.00%

Abstract:

The radiofrequency spectrum is allocated in such a way that fixed bands are assigned to certain users, called licensed users, and cannot be used by unlicensed users even when the spectrum is idle. This inefficient use of spectrum leads to spectral holes. To overcome the problem of spectral holes and increase the efficiency of the spectrum, Cognitive Radio (CR) was used; all simulation work was done in MATLAB. The performance of different spectrum sensing techniques, such as matched-filter-based spectrum sensing and energy detection, was analyzed. This performance depends on various factors: the number of input samples, the signal-to-noise ratio (SNR), the modulation system (QPSK or BPSK), and the fading channel. These were varied to identify the best possible channels and systems for spectrum sensing and to improve the probability of detection. The study found that an averaging filter performs better than an IIR filter. As the number of inputs and the SNR increased, the probability of detection also improved. The Rayleigh fading channel performed better than the Rician and Nakagami fading channels.
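As a concrete reference for the energy-detection results discussed above, here is a minimal Monte Carlo sketch of the detector's probability of detection under Rayleigh fading, with the threshold set from a noise-only false-alarm constraint. The sample counts, false-alarm rate and random-phase signal model are illustrative assumptions, not the thesis's MATLAB setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def energy_detection_pd(snr_db, n_samples=1000, n_trials=2000, pfa=0.01):
    """Monte Carlo probability of detection for an energy detector
    over a Rayleigh fading channel (illustrative parameters)."""
    # Threshold from the noise-only case at the desired false-alarm rate.
    noise = (rng.standard_normal((n_trials, n_samples))
             + 1j * rng.standard_normal((n_trials, n_samples)))
    thresh = np.quantile(np.sum(np.abs(noise) ** 2, axis=1), 1 - pfa)

    # Signal with Rayleigh-fading amplitude and random phase.
    snr = 10 ** (snr_db / 10)
    h = (rng.standard_normal((n_trials, 1))
         + 1j * rng.standard_normal((n_trials, 1))) / np.sqrt(2)
    s = np.sqrt(2 * snr) * np.exp(2j * np.pi * rng.random((n_trials, n_samples)))
    rx = h * s + (rng.standard_normal((n_trials, n_samples))
                  + 1j * rng.standard_normal((n_trials, n_samples)))
    return np.mean(np.sum(np.abs(rx) ** 2, axis=1) > thresh)

for snr_db in (-15, -10, -5):
    print(snr_db, "dB ->", energy_detection_pd(snr_db))
```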

Relevance: 40.00%

Abstract:

In the field of industrial automation, there is an increasing need for optimal control systems that have low tracking errors and low power and energy consumption. The motors dealt with here are mainly Permanent Magnet Synchronous Motors (PMSMs), controlled by three different types of controllers: a position controller, a speed controller, and a current controller. In this thesis, therefore, we act on the gains of the first two controllers, using the TwinCAT 3 software to find the best set of parameters. To do this, starting from the default parameters recommended by TwinCAT, two main methods were used and then compared: the Ziegler-Nichols method, which is a tabular method, and advanced tuning, an auto-tuning software method of TwinCAT. To analyse which set of parameters was best, several experiments were performed for each case using the Motion Control function blocks. Moreover, some machines, such as large robotic arms, have vibration problems. To analyse them in detail, it was necessary to use the Bode Plot tool, which highlights the frequencies at which resonance and anti-resonance peaks occur. This tool also makes it easier to determine which filters to apply, and where, in order to improve control.
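For reference, the classic closed-loop Ziegler-Nichols rules the thesis compares against reduce to a small lookup from the ultimate gain Ku and the oscillation period Tu. The table below is the standard published one; the example values of Ku and Tu are made up.

```python
def ziegler_nichols(ku, tu, controller="pid"):
    """Classic closed-loop Ziegler-Nichols rules: from the ultimate
    gain Ku and oscillation period Tu, return (Kp, Ti, Td)."""
    rules = {
        "p":   (0.50 * ku, float("inf"), 0.0),
        "pi":  (0.45 * ku, tu / 1.2,     0.0),
        "pid": (0.60 * ku, tu / 2.0,     tu / 8.0),
    }
    return rules[controller]

# Example: sustained oscillation found at Ku = 12 with period Tu = 0.04 s.
kp, ti, td = ziegler_nichols(12.0, 0.04)
```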

Relevance: 30.00%

Abstract:

The aim of this investigation was to compare the skeletal stability of three different rigid fixation methods after mandibular advancement. Fifty-five class II malocclusion patients treated by bilateral sagittal split ramus osteotomy and mandibular advancement were selected for this retrospective study. Group 1 (n = 17) had miniplates with monocortical screws, Group 2 (n = 16) had bicortical screws, and Group 3 (n = 22) had the osteotomy fixed by means of the hybrid technique. Cephalograms were taken preoperatively, 1 week postoperatively, and 6 months after the orthognathic surgery. Linear and angular changes of the cephalometric landmarks of the chin region were measured at each period, and the changes at each cephalometric landmark were determined for the intervals between them. Postoperative changes in mandibular shape were analyzed to determine the stability of the fixation methods. There was minimal difference in the relapse of the mandibular advancement among the three groups, and statistical analysis showed no significant difference in postoperative stability. However, a positive correlation between the amount of advancement and the amount of postoperative relapse was demonstrated by multiple linear regression (p < 0.05). It can be concluded that all three techniques can be used to obtain stable postoperative results in mandibular advancement after 6 months.

Relevance: 30.00%

Abstract:

One of the great challenges for the scientific community working on theories of genetic information, genetic communication and genetic coding is to determine a mathematical structure related to DNA sequences. In this paper we propose a model of an intra-cellular transmission system of genetic information, similar to a model of a power- and bandwidth-efficient digital communication system, in order to identify a mathematical structure in biologically relevant DNA sequences. The model of a transmission system of genetic information is concerned with the identification, reproduction and mathematical classification of the nucleotide sequence of single-stranded DNA by the genetic encoder. Hence, a genetic encoder is devised in which labelings and cyclic codes are established. Establishing the algebraic structure of the corresponding code alphabets, mappings, labelings, primitive polynomials p(x) and code generator polynomials g(x) is quite important in characterizing error-correcting code subclasses of G-linear codes. These latter codes are useful for the identification, reproduction and mathematical classification of DNA sequences. The characterization of this model may contribute to the development of a methodology that can be applied in mutational analysis and polymorphisms, the production of new drugs and genetic improvement, among other things, resulting in reduced time and laboratory costs.
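The paper's specific labelings and G-linear codes are not reproduced in the abstract, so the following is only a generic illustration of the ingredients it names: a binary cyclic code whose generator polynomial g(x) = 1 + x + x^3 divides x^7 - 1 over GF(2) (the (7,4) Hamming code), together with a hypothetical two-bit nucleotide labeling.

```python
import numpy as np

def polymul_mod2(a, b):
    """Multiply binary polynomials (coefficient arrays, lowest degree
    first) over GF(2)."""
    return np.mod(np.convolve(a, b), 2)

# g(x) = 1 + x + x^3 generates the (7,4) cyclic Hamming code; the fact
# that g(x) divides x^7 - 1 over GF(2) is what makes the code cyclic.
g = np.array([1, 1, 0, 1])

def encode(message_bits):
    """Non-systematic cyclic encoding: c(x) = m(x) * g(x)."""
    return polymul_mod2(np.asarray(message_bits), g)

# A hypothetical labeling of nucleotides onto bit pairs; the paper's
# labelings are chosen to match the code's algebraic structure.
labeling = {"A": (0, 0), "C": (0, 1), "G": (1, 0), "T": (1, 1)}
bits = [b for nt in "GATT" for b in labeling[nt]]
codeword = encode(bits[:4])   # encode one 4-bit message block
print(codeword)               # a 7-bit cyclic codeword
```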

Relevance: 30.00%

Abstract:

The use of screening techniques, such as an alternative light source (ALS), is important for finding biological evidence at a crime scene. The objective of this study was to evaluate whether the fluorescence of biological fluids (blood, semen, saliva, and urine) deposited on different surfaces changes as a function of the age of the sample. Stains were illuminated with a Megamaxx™ ALS System and photographed with a Canon EOS Utility™ camera. Adobe Photoshop™ was used to prepare the photographs for analysis, and ImageJ™ was then used to record the brightness values of pixels in the images. Data were submitted to analysis of variance using a generalized linear mixed model with two fixed effects (surface and fluid). Time was treated as a random effect (through repeated measures) with a first-order autoregressive covariance structure. Means of significant effects were compared by the Tukey test. The fluorescence of the analyzed biological material varied depending on the age of the sample: fluorescence was lower while the samples were moist, and it remained constant once the sample was dry, up to the maximum period analyzed (60 days), independent of the substrate on which the fluid was deposited, a novel finding of this study. Therefore, the forensic expert can detect biological fluids at the crime scene using an ALS even several days after a crime has occurred.
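As a rough sketch of the reported analysis, the two fixed effects can be fit with statsmodels' MixedLM on a synthetic stand-in for the brightness table. Note the simplifications: a random intercept per stain stands in for the paper's first-order autoregressive covariance, which MixedLM does not reproduce here, and every column name is invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in mirroring the design: repeated brightness readings
# over 60 days for replicate stains of each fluid-surface combination.
rng = np.random.default_rng(0)
rows = [(fl, su, rep, day)
        for fl in ("blood", "semen", "saliva", "urine")
        for su in ("fabric", "tile", "wood")
        for rep in range(3)
        for day in range(0, 61, 10)]
df = pd.DataFrame(rows, columns=["fluid", "surface", "rep", "day"])
df["stain"] = df["fluid"] + "_" + df["surface"] + "_" + df["rep"].astype(str)
df["brightness"] = 100 + rng.normal(0, 5, len(df))

# Two fixed effects (surface, fluid), stain as the repeated-measures unit.
fit = smf.mixedlm("brightness ~ surface * fluid", df, groups=df["stain"]).fit()
print(fit.summary())
```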

Relevance: 30.00%

Abstract:

A proper cast is essential for successful rehabilitation with implant prostheses, in order to produce better structures and induce less strain on the implants. The aim of this study was to evaluate the precision of four different mold-filling techniques and to verify an accurate methodology for evaluating them. A total of 40 casts were obtained from a metallic matrix simulating a three-unit implant-retained prosthesis. The molds were filled using four different techniques in four groups (n = 10): Group 1, single-portion filling technique; Group 2, two-step filling technique; Group 3, latex cylinder technique; Group 4, joining the implant analogs prior to mold filling. A titanium framework was obtained and used as a reference to evaluate the marginal misfit and tension forces in each cast. Vertical misfit was measured with an optical microscope at 120× magnification following the single-screw test protocol. Strain was quantified using strain gauges. Data were analyzed using one-way ANOVA and Tukey's test (α = 0.05). The correlation between strain and vertical misfit was evaluated by the Pearson test. The misfit values did not differ statistically (P = 0.979), while the strain results showed a statistical difference between Groups 3 and 4 (P = 0.027). The splinting technique was considered to be as efficient as the conventional technique. The strain gauge methodology was accurate for strain measurements and cast distortion evaluation. There was no correlation between strain and marginal misfit.
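A minimal sketch of the reported statistical pipeline (one-way ANOVA followed by Tukey's test at α = 0.05), run on fabricated strain values purely to show the mechanics; the numbers do not come from the study.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical strain readings (n = 10 casts per filling technique),
# standing in for the study's strain-gauge data.
rng = np.random.default_rng(0)
groups = {f"G{i}": rng.normal(mu, 40, 10)
          for i, mu in zip(range(1, 5), [230, 240, 200, 290])}

# One-way ANOVA across the four techniques.
f_stat, p = stats.f_oneway(*groups.values())
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p:.3f}")

# Tukey's HSD for the pairwise comparisons.
values = np.concatenate(list(groups.values()))
labels = np.repeat(list(groups.keys()), 10)
print(pairwise_tukeyhsd(values, labels, alpha=0.05))
```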

Relevance: 30.00%

Abstract:

The aim of this study was to compare the performance of the following techniques for the isolation of volatiles of importance to the aroma/flavor of fresh cashew apple juice: dynamic headspace analysis using Porapak Q® as the trap, solvent extraction with and without further concentration of the isolate, and solid-phase microextraction (DVB/CAR/PDMS fiber). A total of 181 compounds were identified, of which 44 were esters, 20 terpenes, 19 alcohols, 17 hydrocarbons, 15 ketones and 14 aldehydes, among others. Sensory evaluation of the gas chromatography effluents revealed esters (n = 24) and terpenes (n = 10) as the most important aroma compounds. The four techniques were efficient in isolating esters, a chemical class of high impact on cashew aroma/flavor. However, the dynamic headspace methodology produced an isolate in which the analytes were at greater concentration, which facilitates their identification (by gas chromatography-mass spectrometry) and their sensory evaluation in the chromatographic effluents. Solvent extraction (dichloromethane) without further concentration of the isolate was the most efficient methodology for the isolation of terpenes. Because these two techniques also isolated at greater concentration the volatiles from other chemical classes important to cashew aroma, such as aldehydes and alcohols, they were considered the most advantageous for the study of cashew aroma/flavor.

Relevance: 30.00%

Abstract:

The aim was to perform a comparative evaluation of the mechanical resistance of simulated mandibular body fractures repaired using different fixation techniques with two different brands of 2.0 mm locking fixation systems. Four aluminum hemimandibles with linear sectioning simulating a mandibular body fracture were used as the substrates and were fixed using the two techniques and the two brands of fixation plate. These were divided into four groups: groups I and II were fixed with one four-hole plate with four 6 mm screws in the tension zone and one four-hole plate with four 10 mm screws in the compression zone; groups III and IV were fixed with one four-hole plate with four 6 mm screws in the neutral zone. Fixation plates manufactured by Tóride were used for groups I and III, and by Traumec for groups II and IV. The hemimandibles were submitted to vertical, linear load testing in an Instron 4411 servohydraulic mechanical testing unit, and the load/displacement values (at 3 mm, 5 mm and 7 mm) and peak loads were measured. Means and standard deviations were evaluated by analysis of variance at a significance level of 5%. The only significant difference between the brands was seen at a displacement of 7 mm. Comparing the techniques, groups I and II showed higher mechanical strength than groups III and IV, as expected. For the treatment of linear mandibular body fractures, two locking plates, one in the tension zone and another in the compression zone, provide greater mechanical strength than a single locking plate in the neutral zone.