14 results for Application of Data-driven Modelling in Water Sciences

at Instituto Politécnico do Porto, Portugal


Relevance:

100.00%

Publisher:

Abstract:

The Maxwell equations, which express the fundamental laws of electricity and magnetism, involve only integer-order calculus. However, several effects present in electromagnetism have recently motivated an analysis from the fractional calculus (FC) perspective. This mathematical concept allows a deeper insight into many phenomena that classical models overlook. Genetic algorithms (GA), in turn, are an important tool for solving optimization problems that occur in engineering. In this work we use FC and GA to implement an electrical potential of fractional order. The performance of the GA scheme and the convergence of the resulting approximations are analyzed.
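
The abstract does not specify the GA encoding or the fractional-potential series; as a purely illustrative, hedged sketch, the snippet below implements a minimal real-coded GA that fits the coefficients of an assumed truncated power series to a sampled target potential. All names (approx, fitness, pop_size) and the series form are hypothetical.

```python
# Hypothetical sketch: a real-coded genetic algorithm fitting the coefficients of a
# truncated series so that it approximates a sampled target potential.
# The series form and all parameter names are illustrative assumptions, not the paper's scheme.
import numpy as np

rng = np.random.default_rng(0)

r = np.linspace(1.0, 5.0, 200)          # radial sample points (arbitrary range)
target = 1.0 / r                        # stand-in target potential (e.g. 1/r)

def approx(coeffs, alpha=0.5):
    """Truncated series sum_k c_k * r**(-(k+1)*alpha); purely illustrative form."""
    return sum(c * r ** (-(k + 1) * alpha) for k, c in enumerate(coeffs))

def fitness(coeffs):
    return -np.mean((approx(coeffs) - target) ** 2)   # higher is better

pop_size, n_genes, n_gen = 60, 4, 200
pop = rng.uniform(-2, 2, size=(pop_size, n_genes))

for _ in range(n_gen):
    scores = np.array([fitness(ind) for ind in pop])
    # tournament selection between random pairs
    idx = rng.integers(0, pop_size, size=(pop_size, 2))
    parents = pop[np.where(scores[idx[:, 0]] > scores[idx[:, 1]], idx[:, 0], idx[:, 1])]
    # arithmetic crossover and Gaussian mutation
    mates = parents[rng.permutation(pop_size)]
    w = rng.uniform(size=(pop_size, 1))
    children = w * parents + (1 - w) * mates + rng.normal(0, 0.05, size=parents.shape)
    # elitism: keep the best individual found so far
    children[0] = pop[np.argmax(scores)]
    pop = children

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best coefficients:", best)
```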

Relevance:

100.00%

Publisher:

Abstract:

Temporomandibular disorders (TMD) consist of a group of pathologies that affect the masticatory muscles, the temporomandibular joints (TMJ), and/or related structures. String instrumentalists, like many orchestra musicians, can spend hours in head postures that may influence the biomechanical behavior of the TMJ and the muscles of the craniocervicomandibular complex (CCMC). Abnormal postures adopted by musicians during performance can lead to muscular hyperactivity of the head and cervical muscles, with the possible appearance of TMD. Medical infrared thermography is a non-invasive procedure that can monitor changes in superficial tissue related to blood circulation and may serve as a complement to the clinical examination. The objective of this study was to use infrared thermography to evaluate, in one subject, the cutaneous thermal changes adjacent to the CCMC that occur before, during, and after playing a string instrument.

Relevance:

100.00%

Publisher:

Abstract:

Buildings and the built environment as a whole play a key role as societies mitigate climate change and adapt to its consequences. More than 50% of the existing residential buildings in the EU-25 were built before 1970, so these buildings are of significant importance in reducing energy consumption and CO2 emissions. Increasing the number of nearly zero energy buildings (nZEB) is a possible solution to this problem. This study aims to analyze the application of the nZEB methodology to the retrofitting of a typical Portuguese dwelling built in 1950. It is shown that, with the application of the best construction techniques together with the use of energy from on-site renewable sources, the primary energy used can be reduced to a very low value (11.95 kWhep/m².y) in comparison with the reference consumption (69.15 kWhep/m².y).
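
For orientation, the figures quoted correspond to a primary-energy reduction of roughly 83%; a minimal check of that arithmetic, using only the values given in the abstract:

```python
# Quick check of the reduction implied by the figures quoted in the abstract.
reference = 69.15   # kWh_ep/(m2.y), reference consumption
retrofit = 11.95    # kWh_ep/(m2.y), after retrofit
reduction = (reference - retrofit) / reference * 100
print(f"primary energy reduction: {reduction:.1f}%")   # about 82.7%
```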

Relevance:

100.00%

Publisher:

Abstract:

Fractional Calculus (FC) goes back to the beginning of the theory of differential calculus. Nevertheless, applications of FC have emerged only in the last two decades. The advantages of this mathematical tool for the modelling and control of many dynamical systems have been recognized. With these ideas in mind, this paper discusses an FC perspective on the study of the dynamics and control of several systems. The paper investigates the use of FC in the fields of controller tuning, legged robots, electrical systems and digital circuit synthesis.
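
In the controller-tuning field mentioned above, the standard fractional-order generalization of the PID controller is the one often denoted PI^λD^μ; as background (a textbook form, not a result of this paper), its transfer function is commonly written as

$$
C(s) = K_p + \frac{K_i}{s^{\lambda}} + K_d\, s^{\mu}, \qquad \lambda, \mu > 0,
$$

which reduces to the classical PID controller for $\lambda = \mu = 1$.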

Relevance:

100.00%

Publisher:

Abstract:

Fractional Calculus (FC) goes back to the beginning of the theory of differential calculus. Nevertheless, applications of FC have emerged only in the last two decades. In the field of dynamical systems theory some work has been carried out, but the proposed models and algorithms are still at a preliminary stage of establishment. With these ideas in mind, the paper discusses an FC perspective on the study of the dynamics and control of mechanical systems.
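
Numerical studies of fractional-order dynamics typically discretize the fractional derivative; a minimal sketch of the standard Grünwald–Letnikov approximation (a common scheme, not necessarily the one adopted in the paper) follows, checked against the known half-derivative of f(t) = t.

```python
# Grünwald–Letnikov approximation of the fractional derivative of order alpha:
# D^alpha f(t) ≈ h**(-alpha) * sum_k (-1)**k * C(alpha, k) * f(t - k*h)
import numpy as np

def gl_derivative(f_samples, alpha, h):
    """Fractional derivative at the last sample point, from equally spaced samples."""
    n = len(f_samples)
    # binomial weights (-1)**k * C(alpha, k), built with the usual recursion
    c = np.empty(n)
    c[0] = 1.0
    for k in range(1, n):
        c[k] = c[k - 1] * (1 - (alpha + 1) / k)
    return h ** (-alpha) * np.dot(c, f_samples[::-1])

# example: half-derivative of f(t) = t at t = 1; the exact value is 2*sqrt(t/pi)
t = np.linspace(0.0, 1.0, 1001)
approx = gl_derivative(t, alpha=0.5, h=t[1] - t[0])
print(approx, 2 * np.sqrt(1.0 / np.pi))   # both close to ~1.128
```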

Relevance:

100.00%

Publisher:

Abstract:

The work presented here was conducted within the Dissertation/Internship in Environmental Protection Technology, associated with the Master's thesis in Chemical Engineering at the Instituto Superior de Engenharia do Porto, and was developed at Aquatest a.s., headquartered in Prague, Czech Republic. Ore mining in the Czech Republic began in the thirteenth century and continued into the twentieth century, and the consequences of this intensive extraction are now evident, including contamination of soil and subsoil by high concentrations of heavy metals. The mountain region of Zlaté Hory was chosen for the implementation of the remediation project, which consisted of the construction of three cells (tanks): the first to raise the pH, the second for the sedimentation of the precipitates formed, and a third to increase process efficiency, with the aim of reducing high concentrations of metals, with special emphasis on iron, manganese and sulphates. The project was initiated in 2005, a pioneering effort in the country, and is still ongoing due to the complex chemical and biological phenomena inherent to the system. At the site where the project was implemented there is a natural lagoon, which enabled a comparative study of the two systems (natural and artificial) with respect to their efficiency in reducing or removing the pollutants mentioned. The study aimed to assist the ongoing investigation at Aquatest, both in the field work conducted at Zlaté Hory and in the research methodologies used. A survey and analysis of the data available from 2005 to 2008 was carried out and complemented by the treatment of new data from 2009 to 2010, together with a theoretical study of the chemical and biological processes that occur in both systems. Regarding the field work, the author actively participated in the collection and in situ analysis of water and soil samples from the natural pond, under the supervision of engineer Irena Šupiková; laboratory analyses of water and soil were carried out by laboratory technicians. The natural lagoon was found to be more efficient in reducing iron and manganese, with removal percentages of 100%, whereas the artificial lagoon achieved removal percentages of 90% and 33% for iron and manganese, respectively. Despite the lower efficiency of the constructed wetland, it must be pointed out that this system was designed for the treatment and consequent reduction of iron, so its main goal can be considered achieved. In the case of sulphates, optimizing removal remains a goal to be achieved, not only in the Czech Republic but also in other places where this type of contamination persists: removal efficiencies of 45% and 7% were obtained in the natural lagoon and in the constructed wetland, respectively. It is speculated that the water entering the two systems comes from different sources. The collected data show, at the entrance of the natural pond, concentrations of 4.6 mg/L of total iron, 14.6 mg/L of manganese and 951 mg/L of sulphates; in the artificial pond the concentrations are 27.7 mg/L, 8.1 mg/L and 382 mg/L for iron, manganese and sulphates, respectively. During 2010 the investigation was expanded: the study of soil samples, still in its early phase, was started in order to observe and evaluate the contribution of bacteria to the removal of heavy metals.
In summary, this technology has proven to be an interesting solution: in addition to substantially reducing the contaminants mentioned, mostly iron, it combines low implementation cost with reduced maintenance, and it can also be installed in recreational parks, providing habitats for plants and birds.
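
As an illustration of the arithmetic behind these efficiencies, the snippet below recovers the outlet concentrations implied by the inflow values and removal percentages reported for the artificial pond; the outlet values are derived here for orientation only and are not reported in the abstract.

```python
# Removal efficiency: eta = (C_in - C_out) / C_in * 100
# Inflow concentrations and removal percentages are those quoted in the abstract
# (artificial pond / constructed wetland); the implied outlet concentrations are
# derived here only as an illustration, not reported values.
inflow = {"Fe": 27.7, "Mn": 8.1, "SO4": 382.0}     # mg/L at the entrance
removal = {"Fe": 90.0, "Mn": 33.0, "SO4": 7.0}     # % removal in the constructed wetland

for species, c_in in inflow.items():
    c_out = c_in * (1 - removal[species] / 100)
    print(f"{species}: in {c_in:6.1f} mg/L -> implied out {c_out:6.1f} mg/L "
          f"({removal[species]:.0f}% removal)")
```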

Relevance:

100.00%

Publisher:

Abstract:

The main objective of this work was to develop an application capable of determining the diffusion times and diffusion coefficients of optical clearing agents and water inside a known type of muscle. Other chemical agents, such as medications or metabolic products, can also be studied with the implemented method. Since the diffusion times can be calculated, it is possible to describe the dehydration mechanism that occurs in the muscle, and the diffusion time of an optical clearing agent characterizes the refractive-index matching mechanism of optical clearing. Using the diffusion times and diffusion coefficients of both water and clearing agents, not only are the optical clearing mechanisms characterized, but information about the duration and magnitude of the optical clearing effect is also obtained. Such information is crucial for planning a clinical intervention assisted by optical clearing. The experimental method and the equations implemented in the developed application are described throughout this document, demonstrating its effectiveness. The application was developed in MATLAB, with the method tailored to the application's needs. This significantly improved processing efficiency and reduced the time needed to obtain results; multiple validations prevent common errors, and extra functionalities were added, such as saving the application progress or exporting information in different formats. Tests were made using glucose measurements in muscle, and some of the data were intentionally changed for testing purposes in order to obtain different simulations and results from the application. The entire project was validated by comparing the calculated results with those found in the literature, which are also described in this document.
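
One common way of extracting a characteristic diffusion time from such measurements is to fit a saturating exponential to the recorded signal and, for a slab of thickness d, relate the slowest diffusion mode to an apparent diffusion coefficient through D ≈ d²/(π²τ). The sketch below illustrates that idea with synthetic data; the model form, the thickness and all numbers are assumptions, not the equations actually implemented in the MATLAB application.

```python
# Hedged sketch: estimate a characteristic diffusion time tau by fitting a
# saturating exponential V(t) = A * (1 - exp(-t / tau)) to a measured signal
# (e.g. normalized volume change or transmittance), then relate it to an
# apparent diffusion coefficient for a slab of thickness d via D ~ d**2 / (pi**2 * tau)
# (slowest diffusion mode of a slab). Model form and sample numbers are assumptions.
import numpy as np
from scipy.optimize import curve_fit

def saturating_exp(t, amplitude, tau):
    return amplitude * (1.0 - np.exp(-t / tau))

# synthetic "measurement": tau = 120 s with some noise
t = np.linspace(0, 600, 61)                    # seconds
rng = np.random.default_rng(1)
signal = saturating_exp(t, 1.0, 120.0) + rng.normal(0, 0.02, t.size)

popt, _ = curve_fit(saturating_exp, t, signal, p0=(1.0, 60.0))
amplitude, tau = popt

d = 0.5e-3                                      # slab thickness in metres (assumed)
D = d ** 2 / (np.pi ** 2 * tau)                 # apparent diffusion coefficient
print(f"tau ≈ {tau:.1f} s, D ≈ {D:.2e} m^2/s")
```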

Relevance:

100.00%

Publisher:

Abstract:

A multi-residue methodology based on solid phase extraction followed by gas chromatography–tandem mass spectrometry was developed for trace analysis of 32 compounds in water matrices, including estrogens and several pesticides from different chemical families, some of them with endocrine-disrupting properties. Matrix-matched standard calibration solutions were prepared by adding known amounts of the analytes to a residue-free sample, to compensate for the matrix-induced chromatographic response enhancement observed for certain pesticides. Validation was performed mainly according to the International Conference on Harmonisation recommendations, as well as some European and American validation guidelines with specifications for pesticide analysis and/or GC–MS methodology. As the assumption of homoscedasticity was not met for the analytical data, a weighted least squares linear regression procedure was applied as a simple and effective way to counteract the greater influence of the higher concentrations on the fitted regression line, improving accuracy at the lower end of the calibration curve. The method was considered validated for 31 compounds after consistent evaluation of the key analytical parameters: specificity, linearity, limits of detection and quantification, range, precision, accuracy, extraction efficiency, stability and robustness.
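
The weighted least squares step can be reproduced in outline as below; the weighting scheme (weights inversely proportional to the standard deviation of each calibration level) and all numbers are illustrative assumptions, not the validated calibration data.

```python
# Hedged sketch of a weighted least squares calibration line, with weights
# proportional to the inverse of the standard deviation at each level
# (so that squared residuals are weighted by 1/sd**2). Numbers are illustrative.
import numpy as np

conc = np.array([1, 5, 10, 50, 100, 500], dtype=float)        # ng/L (assumed levels)
response = np.array([0.9, 5.3, 9.8, 52.0, 104.0, 488.0])      # peak-area ratio (synthetic)
sd = np.array([0.1, 0.3, 0.5, 2.5, 5.0, 25.0])                # SD of replicates per level

# np.polyfit minimizes sum((w * (y - p(x)))**2), so w = 1/sd gives 1/sd**2 weighting
slope, intercept = np.polyfit(conc, response, deg=1, w=1.0 / sd)

# ordinary (unweighted) fit for comparison: dominated by the highest levels
slope_ols, intercept_ols = np.polyfit(conc, response, deg=1)

print(f"WLS: y = {slope:.4f} x + {intercept:.4f}")
print(f"OLS: y = {slope_ols:.4f} x + {intercept_ols:.4f}")
```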

Relevance:

100.00%

Publisher:

Abstract:

This paper describes the use of integer- and fractional-order electrical elements for modelling two electrochemical systems. The first type of system consists of botanical elements, and the second is implemented by electrolyte processes with fractal electrodes. Experimental results are analyzed in the frequency domain, and the pros and cons of adopting fractional-order electrical components for modelling these systems are compared.
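
A typical fractional-order electrical element in this context is the constant phase element, whose impedance follows a fractional power of frequency. The short sketch below evaluates such an element, purely to illustrate the kind of component involved; the parameter values are assumed, not taken from the experiments.

```python
# Frequency response of a constant phase element (a fractional-order capacitor):
# Z(jw) = 1 / (C0 * (j*w)**alpha); alpha = 1 recovers an ideal capacitor.
# Parameter values are illustrative only.
import numpy as np

C0, alpha = 1e-5, 0.8
freq = np.logspace(-1, 4, 6)                  # Hz
w = 2 * np.pi * freq
Z = 1.0 / (C0 * (1j * w) ** alpha)

for f, z in zip(freq, Z):
    print(f"f = {f:8.1f} Hz  |Z| = {abs(z):12.2f} ohm  "
          f"phase = {np.degrees(np.angle(z)):6.1f} deg")   # constant phase of -alpha*90 deg
```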

Relevance:

100.00%

Publisher:

Abstract:

It has been shown that, in reality, at least two general scenarios of data structuring are possible: (a) a self-similar (SS) scenario, when the measured data form an SS structure, and (b) a quasi-periodic (QP) scenario, when the repeated (strongly correlated) data form random sequences that are almost periodic with respect to each other. In the second case it becomes possible to describe their behavior and express a part of their randomness quantitatively in terms of the deterministic amplitude–frequency response belonging to the generalized Prony spectrum. This possibility allows us to re-examine the conventional concept of measurement and opens a new way to describe a wide set of different data. In particular, it concerns complex systems for which a ‘best-fit’ model describing the measured data is absent, but a description of these data in terms of a reduced number of quantitative parameters is nevertheless needed. The possibilities of the proposed approach and of the detection algorithm for QP processes were demonstrated on real data: spectroscopic data recorded for pure water and acoustic data for a test hole. The suggested methodology allows revising the accepted classification of different incommensurable and self-affine spatial structures and finding an accurate interpretation of generalized Prony spectroscopy, which includes Fourier spectroscopy as a particular case.
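
As a hedged illustration of the kind of decomposition involved, the sketch below implements the classical polynomial Prony method (not the authors' generalized Prony spectrum) for fitting a uniformly sampled signal with a sum of damped complex exponentials:

```python
# Classical Prony method: fit x[n] ≈ sum_k A_k * z_k**n, with z_k = exp((d_k + i*w_k)*dt),
# by (1) solving a linear-prediction system for the polynomial coefficients,
# (2) rooting the polynomial to obtain the modes z_k, and
# (3) solving a Vandermonde least squares problem for the amplitudes A_k.
import numpy as np

def prony(x, p):
    x = np.asarray(x, dtype=complex)
    n = len(x)
    # linear prediction: x[m] = -sum_{k=1..p} a_k * x[m-k]   for m = p .. n-1
    M = np.column_stack([x[p - k : n - k] for k in range(1, p + 1)])
    a, *_ = np.linalg.lstsq(M, -x[p:n], rcond=None)
    # modes are the roots of z**p + a_1*z**(p-1) + ... + a_p
    z = np.roots(np.concatenate(([1.0], a)))
    # amplitudes from x[m] ≈ sum_k A_k * z_k**m
    V = np.vander(z, N=n, increasing=True).T      # shape (n, p): V[m, k] = z_k**m
    amps, *_ = np.linalg.lstsq(V, x, rcond=None)
    return z, amps

# example: two damped sinusoids, recovered with model order p = 4 (conjugate pairs)
dt = 0.01
t = np.arange(0, 2, dt)
signal = 1.5 * np.exp(-0.4 * t) * np.cos(2 * np.pi * 5 * t) \
       + 0.8 * np.exp(-1.0 * t) * np.cos(2 * np.pi * 12 * t + 0.3)
z, amps = prony(signal, p=4)
freqs = np.angle(z) / (2 * np.pi * dt)         # mode frequencies in Hz
damping = np.log(np.abs(z)) / dt               # damping rates in 1/s
print(np.round(np.sort(np.abs(freqs)), 2))     # ≈ [5, 5, 12, 12]
print(np.round(np.sort(damping), 2))           # ≈ [-1, -1, -0.4, -0.4]
```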

Relevance:

100.00%

Publisher:

Abstract:

Wind resource evaluation at two sites located in Portugal was performed using the mesoscale modelling system Weather Research and Forecasting (WRF) and the wind resource analysis tool commonly used within the wind power industry, the Wind Atlas Analysis and Application Program (WAsP) microscale model. Wind measurement campaigns were conducted at the selected sites, allowing a comparison between in situ measurements and simulated wind in terms of flow characteristics and energy yield estimates. Three different methodologies were tested, aiming to provide an overview of their benefits and limitations for wind resource estimation. In the first methodology the mesoscale model acts as a “virtual” wind measuring station: wind data computed by WRF for both sites were inserted directly as input in WAsP. In the second approach the same procedure was followed, but the terrain influences induced by the low-resolution terrain data of the mesoscale model were removed from the simulated wind data. In the third methodology the simulated wind data were extracted at the top of the planetary boundary layer for both sites, aiming to assess whether the use of geostrophic winds (which, by definition, are not influenced by the local terrain) can bring any improvement in the models' performance. The results obtained with these methodologies were compared with those resulting from in situ measurements, in terms of mean wind speed, Weibull probability density function parameters and production estimates, considering the installation of one wind turbine at each site. Results showed that the second approach produces the values closest to the measured ones, and fairly acceptable deviations in estimated annual production were found using this coupling technique. However, mesoscale output should not be used directly in wind farm siting projects, mainly due to the poor resolution of the mesoscale model terrain data. Instead, the use of mesoscale output in microscale models should be seen as a valid alternative to in situ data, mainly for preliminary wind resource assessments, although the coupling of mesoscale and microscale models in areas with complex topography should be done with extreme caution.
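
In outline, the Weibull parameters and the single-turbine production estimate mentioned above can be computed as in the hedged sketch below; the wind series and power curve are synthetic placeholders, not the campaign data:

```python
# Hedged sketch: fit Weibull parameters to a wind-speed series and estimate annual
# energy production for a single turbine from a placeholder power curve.
import numpy as np
from scipy.stats import weibull_min

wind = weibull_min.rvs(2.1, scale=7.5, size=8760, random_state=7)   # synthetic hourly series, m/s

# Weibull fit with location fixed at zero (the usual convention for wind speed)
shape_k, _, scale_A = weibull_min.fit(wind, floc=0)
print(f"Weibull k = {shape_k:.2f}, A = {scale_A:.2f} m/s")

# placeholder power curve of a generic 2 MW turbine (kW vs m/s), linearly interpolated
v_curve = np.array([0, 3, 5, 7, 9, 11, 13, 25, 26])
p_curve = np.array([0, 0, 150, 550, 1300, 1900, 2000, 2000, 0])
power = np.interp(wind, v_curve, p_curve)       # kW for every hour of the year

aep = power.sum() / 1000.0                      # MWh/year (one value per hour)
print(f"estimated annual production ≈ {aep:.0f} MWh")
```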

Relevance:

100.00%

Publisher:

Abstract:

Prostate Specific Antigen (PSA) is the biomarker of choice for screening prostate cancer in the population, with PSA values above 10 ng/mL indicating a high probability of associated cancer [1]. According to the most recent World Health Organization (WHO) data, prostate cancer is the most common form of cancer in men in Europe [2]. Early detection of prostate cancer is thus very important and is currently made by screening PSA in men over 45 years old, combined with other alterations in serum and urine parameters. PSA is a glycoprotein with a molecular mass of approximately 32 kDa consisting of one polypeptide chain, which is produced by the secretory epithelium of the human prostate. Currently, the standard methods available for PSA screening are immunoassays such as the Enzyme-Linked Immunosorbent Assay (ELISA). These methods are highly sensitive and specific for the detection of PSA, but they require expensive laboratory facilities and highly qualified personnel. Other highly sensitive and specific methods for the detection of PSA have also become available; most of them are immunobiosensors [1,3-5], relying on antibodies. Less expensive methods producing quicker responses are thus needed, which may be achieved by synthesizing artificial antibodies by means of molecular imprinting techniques. These should also be coupled to simple and low-cost devices, such as potentiometric ones, an approach that has been proven successful [6]. Potentiometric sensors offer the advantages of selectivity and portability for use at the point of care and have been widely recognized as potential analytical tools in this field. The inherent method is simple, precise, accurate and inexpensive in terms of reagent consumption and equipment. Thus, this work proposes a new plastic antibody for PSA, designed over the surface of graphene layers extracted from graphite. Charged monomers were used to enable an oriented tailoring of the PSA rebinding sites; uncharged monomers were used as a control. These materials were used as ionophores in conventional solid-contact graphite electrodes. The results showed that the imprinted materials displayed a selective response to PSA. The electrodes with charged monomers showed a more stable and sensitive response, with an average slope of -44.2 mV/decade and a detection limit of 5.8×10⁻¹¹ mol/L (2 ng/mL). The corresponding non-imprinted sensors showed smaller sensitivity, with average slopes of -24.8 mV/decade. The best sensors were successfully applied to the analysis of serum samples, with percentage recoveries of 106.5% and relative errors of 6.5%.
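
The slope quoted in mV/decade comes from a standard potentiometric calibration of the Nernstian form E = E⁰ + S·log₁₀[PSA]. A minimal sketch of that fit, using synthetic readings rather than the paper's measurements, is:

```python
# Hedged sketch of a potentiometric calibration: fit E = E0 + S * log10(C)
# and read the slope S in mV per decade. Concentrations and potentials are synthetic.
import numpy as np

conc = np.array([1e-10, 1e-9, 1e-8, 1e-7, 1e-6])     # mol/L
emf = np.array([152.0, 108.5, 63.0, 19.5, -24.0])    # mV (synthetic readings)

slope, e0 = np.polyfit(np.log10(conc), emf, deg=1)
print(f"slope = {slope:.1f} mV/decade, E0 = {e0:.1f} mV")   # roughly -44 mV/decade here
```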

Relevance:

100.00%

Publisher:

Abstract:

Underground scenarios are among the most challenging environments for accurate and precise 3D mapping, where hostile conditions such as the absence of Global Positioning System coverage, extreme lighting variations and geometrically smooth surfaces may be expected. So far, the state-of-the-art methods in underground modelling remain restricted to environments in which pronounced geometric features are abundant. This limitation is a consequence of the scan-matching algorithms used to solve the localization and registration problems. This paper contributes to expanding the modelling capabilities to structures characterized by uniform geometry and smooth surfaces, as is the case of road and train tunnels. To achieve that, we combine state-of-the-art techniques from mobile robotics and propose a method for 6-DOF platform positioning in such scenarios, which is later used for environment modelling. A visual monocular Simultaneous Localization and Mapping (MonoSLAM) approach based on the Extended Kalman Filter (EKF), complemented by the introduction of inertial measurements in the prediction step, allows our system to localize itself over long distances using exclusively sensors carried on board a mobile platform. By feeding the Extended Kalman Filter with inertial data we were able to overcome the major problem of MonoSLAM implementations, known as scale-factor ambiguity. Despite extreme lighting variations, reliable visual features were extracted with the SIFT algorithm and inserted directly into the EKF mechanism according to the Inverse Depth Parametrization. Wrong frame-to-frame feature matches were rejected using 1-Point RANSAC (Random Sample Consensus). The developed method was tested on a dataset acquired inside a road tunnel, and the navigation results were compared with a ground truth obtained by post-processing a high-grade Inertial Navigation System and L1/L2 RTK-GPS measurements acquired outside the tunnel. Results from the localization strategy are presented and analyzed.
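
As a hedged, much simplified illustration of how inertial data enter the EKF prediction step (a 1-D toy model, not the paper's 6-DOF MonoSLAM formulation):

```python
# Toy EKF prediction step driven by an accelerometer reading (1-D position/velocity
# state), illustrating how inertial data enter the prediction stage; this is a
# simplified stand-in, not the 6-DOF MonoSLAM formulation used in the paper.
import numpy as np

def ekf_predict(x, P, accel, dt, accel_noise_var=0.05):
    """Propagate state x = [position, velocity] and covariance P over one step."""
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])                 # state transition (constant-velocity part)
    B = np.array([0.5 * dt ** 2, dt])          # how acceleration enters the state
    x_pred = F @ x + B * accel                 # predicted state
    Q = accel_noise_var * np.outer(B, B)       # process noise from accelerometer noise
    P_pred = F @ P @ F.T + Q                   # predicted covariance
    return x_pred, P_pred

x = np.array([0.0, 0.0])        # start at rest
P = np.eye(2) * 0.01
for _ in range(100):            # 1 s of data at 100 Hz with constant 0.2 m/s^2
    x, P = ekf_predict(x, P, accel=0.2, dt=0.01)
print(x)                        # ≈ [0.1, 0.2]: s = a*t^2/2, v = a*t
```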

Relevance:

100.00%

Publisher:

Abstract:

New arguments proving that successive (repeated) measurements have a memory and actually remember each other are presented. The recognition of this peculiarity can essentially change the existing paradigm associated with conventional observation of the behavior of different complex systems and lead towards the application of an intermediate model (IM). This IM can provide a very accurate fit of the measured data in terms of Prony's decomposition. This decomposition, in turn, contains a small set of fitting parameters relative to the number of initial data points and allows comparing the measured data in cases where a “best fit” model based on specific physical principles is absent. As an example, we consider two X-ray diffractometers (referred to in the paper as A, “cheap”, and B, “expensive”) that are used, after proper calibration, for measuring the same substance (corundum, α-Al2O3). The amplitude-frequency response (AFR) obtained in the frame of Prony's decomposition can be used for comparison of the spectra recorded with the A and B X-ray diffractometers (XRDs), for calibration and other practical purposes. We also show that the Fourier decomposition can be adapted to an “ideal” experiment without memory, while Prony's decomposition corresponds to a real measurement and can, in this case, be fitted in the frame of the IM. New statistical parameters describing the properties of experimental equipment (irrespective of their internal “filling”) are found. The suggested approach is rather general and can be used for calibration and comparison of different complex dynamical systems for practical purposes.