10 results for Application of Data-driven Modelling in Water Sciences

at Universidade do Minho


Relevance:

100.00%

Publisher:

Abstract:

Rockburst is characterized by the violent explosion of a rock block causing a sudden rupture in the rock mass and is quite common in deep tunnels. It is critical to understand the phenomenon of rockburst, focusing on its patterns of occurrence, so these events can be avoided and/or managed, saving costs and possibly lives. The failure mechanism of rockburst needs to be better understood. Laboratory experiments are under way at the State Key Laboratory for Geomechanics and Deep Underground Engineering (SKLGDUE) in Beijing, and the testing system is described. A large number of rockburst tests were performed and their results collected, stored in a database and analyzed. Data Mining (DM) techniques were applied to this database in order to develop predictive models for the rockburst maximum stress (σRB) and the rockburst risk index (IRB), parameters that otherwise require the results of such tests to be determined. With the developed models it is possible to predict these parameters with high accuracy using data from the rock mass and the specific project.
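As an illustration of how a predictive model could be derived from such a rockburst test database, the minimal sketch below trains a random forest regressor to estimate the maximum stress from rock-mass descriptors. The file name, the column names and the choice of algorithm are assumptions made here for the example only and do not reflect the authors' actual pipeline.

```python
# Minimal sketch (not the authors' pipeline): predict rockburst maximum stress
# from rock-mass / project descriptors stored in a test database.
# File name and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

df = pd.read_csv("rockburst_tests.csv")           # hypothetical export of the test database
X = df[["uniaxial_strength", "tensile_strength",  # hypothetical rock-mass descriptors
        "elastic_energy_index", "depth"]]
y = df["sigma_RB"]                                # rockburst maximum stress (target)

model = RandomForestRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("Cross-validated R2:", scores.mean())
```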

Relevance:

100.00%

Publisher:

Abstract:

The currently available clinical imaging methods do not provide highly detailed information about the location and severity of axonal injury or the expected recovery time of patients with traumatic brain injury [1]. High-Definition Fiber Tractography (HDFT) is a novel imaging modality that allows directly visualizing and quantifying the degree of axonal damage, predicting functional deficits due to traumatic axonal injury and loss of cortical projections. This imaging modality is based on diffusion technology [2]. The lack of a phantom able to properly mimic the human brain hinders the possibility of testing, calibrating and validating these medical imaging techniques. Most research done in this area falls short on key points, such as the smallest brain-fiber size that can be reproduced and the quick and easy reproducibility of phantoms [3]. For that reason, it is necessary to develop similar structures matching the micron scale of axon tubes. Flexible textiles can play an important role, since they allow producing controlled packing densities and crossing structures that closely match the crossing patterns of the human brain. To build a brain phantom, several parameters must be taken into account concerning materials selection, such as hydrophobicity, density and fiber diameter, since these factors directly influence the values of fractional anisotropy. Fiber cross-section shape is another important parameter. Earlier studies showed that synthetic fibrous materials are a good choice for building a brain phantom [4]. The present work is integrated in a broader project that aims to develop a brain phantom made of fibrous materials to validate and calibrate HDFT. Due to the similarity between the axons and the thousands of hollow multifilaments in a fibrous arrangement such as a yarn, low-twist polypropylene multifilament yarns were selected for this development. In this sense, extruded hollow filaments were analysed in a scanning electron microscope to characterize their main dimensions and shape. In order to approximate the dimensional scale to that of human axons, five types of polypropylene yarns with different linear density (denier) were used, aiming to understand the effect of linear density on the filament inner and outer areas. Moreover, in order to achieve the required dimensions, the cross-section of the polypropylene filaments was reduced in a drawing stage of a filament extrusion line. Subsequently, tensile tests were performed to characterize the mechanical behaviour of the hollow filaments and to evaluate the differences between stretched and non-stretched filaments. In general, an increase in linear density causes an increase in the size of the filament cross-section. With the increase of the structural orientation of the filaments induced by stretching, breaking tenacity increases and elongation at break decreases. The production of hollow fibers with the required characteristics is one of the key steps to create a brain phantom that properly mimics the human brain and may be used for the validation and calibration of HDFT, an imaging approach that is expected to contribute significantly to brain-related research.
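To make the linear-density reasoning concrete, the sketch below converts a per-filament denier value into an equivalent cross-sectional area and diameter. It is an illustrative calculation only: it assumes a solid filament and a typical polypropylene density of about 0.91 g/cm³ (a hollow filament of the same denier would have a larger outer diameter), and the denier values are not measurements from the study.

```python
# Minimal sketch (illustrative, not data from the study): relate filament linear
# density (denier) to an equivalent cross-sectional area and diameter.
# Assumes a solid filament and a typical polypropylene density of ~0.91 g/cm^3.
import math

PP_DENSITY = 0.91e3          # kg/m^3, typical for polypropylene (assumption)

def filament_dimensions(denier: float):
    """Return (area in um^2, diameter in um) for a solid filament of given denier."""
    mass_per_metre = denier * 1e-3 / 9000.0      # denier = grams per 9000 m
    area_m2 = mass_per_metre / PP_DENSITY        # cross-sectional area
    diameter_m = 2.0 * math.sqrt(area_m2 / math.pi)
    return area_m2 * 1e12, diameter_m * 1e6

for d in (1.0, 2.0, 5.0):                        # illustrative per-filament deniers
    area, diam = filament_dimensions(d)
    print(f"{d:.1f} den -> area ≈ {area:.0f} µm², diameter ≈ {diam:.1f} µm")
```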

Relevance:

100.00%

Publisher:

Abstract:

As huge amounts of data become available in organizations and society, specific data analytics skills and techniques are needed to explore these data and extract from them useful patterns, trends, models or other knowledge that can be used to support the decision-making process, to define new strategies or to understand what is happening in a specific field. Only with a deep understanding of a phenomenon is it possible to fight it. In this paper, a data-driven analytics approach is used to analyse the increasing incidence of fatalities by pneumonia in the Portuguese population, characterizing the disease and its incidence in terms of fatalities; this knowledge can be used to define appropriate strategies aiming to reduce the phenomenon, which has increased by more than 65% in a decade.
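A minimal sketch of the kind of descriptive analysis involved is shown below: it aggregates pneumonia fatalities per year and computes the change over the observed period. The file name, column names and cause label are hypothetical and do not correspond to the dataset used in the paper.

```python
# Minimal sketch (hypothetical data layout, not the study's dataset): summarise
# pneumonia fatalities per year and compute the change over the observed period.
import pandas as pd

deaths = pd.read_csv("mortality_records.csv")             # hypothetical file
pneumonia = deaths[deaths["cause"] == "pneumonia"]         # hypothetical column/label

per_year = pneumonia.groupby("year").size()                # fatalities per year
first, last = per_year.index.min(), per_year.index.max()
change = (per_year[last] - per_year[first]) / per_year[first] * 100
print(per_year)
print(f"Change from {first} to {last}: {change:.1f}%")
```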

Relevance:

100.00%

Publisher:

Abstract:

All currently known prebiotics are carbohydrates of relatively short chain length. An important group is the fructooligosaccharides, a special kind of prebiotic whose selective stimulation of the activity of certain groups of colonic bacteria has a positive and beneficial effect on the intestinal microbiota, reducing the incidence of gastrointestinal and respiratory infections and also providing a recognized bifidogenic effect. Traditionally, these prebiotic compounds have been obtained through extraction processes from some plants, as well as through enzymatic hydrolysis of sucrose. However, different fermentative methods have also been proposed for the production of fructooligosaccharides, such as solid-state fermentation utilizing various agro-industrial by-products. By optimizing the culture parameters, fructooligosaccharide yields and productivity can be improved. The use of immobilized enzymes and cells has also been proposed as an effective and economical method for the large-scale production of fructooligosaccharides. This paper is an overview of the results of recent studies on fructooligosaccharide biosynthesis, physicochemical properties, sources, biotechnological production and applications.

Relevance:

100.00%

Publisher:

Abstract:

Due to water scarcity, it is important to organize and regulate water resources utilization to satisfy conflicting water demands and needs. This paper describes a comprehensive methodology for managing the water sector of a defined urbanized region, using the robust capabilities of a Geographic Information System (GIS). The proposed methodology is based on finding alternatives to cover the gap between current supplies and future demands. Nablus, a main governorate located in the north of the West Bank, Palestine, was selected as a case study because this area is classified as arid to semi-arid. GIS integrates hardware, software, and data for capturing, managing, analyzing, and displaying all forms of geographic information. The resulting plan for Nablus represents an example of the implementation of the proposed methodology and a valid framework for the elaboration of a water master plan.
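The core quantity of the methodology, the gap between current supply and future demand, can be computed per district as in the minimal sketch below; the attribute table, column names and units are hypothetical, and in the actual methodology this step would be performed on GIS layers rather than a plain table.

```python
# Minimal sketch (hypothetical columns, not the paper's dataset): quantify the gap
# between current water supply and projected future demand for each district,
# which is the quantity the GIS-based planning methodology seeks to cover.
import pandas as pd

df = pd.read_csv("nablus_districts.csv")                  # hypothetical attribute table
df["gap_m3_per_day"] = df["projected_demand_m3_per_day"] - df["current_supply_m3_per_day"]
deficit = df[df["gap_m3_per_day"] > 0].sort_values("gap_m3_per_day", ascending=False)
print(deficit[["district", "gap_m3_per_day"]])
```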

Relevance:

100.00%

Publisher:

Abstract:

"The idea that social processes develop in a cyclical manner is somewhat like a 'Lorelei'. Researchers are lured to it because of its theoretical promise, only to become entangled in (if not wrecked by) messy problems of empirical inference. The reasoning leading to hypotheses of some kind of cycle is often elegant enough, yet the data from repeated observations rarely display the supposed cyclical pattern. (...) In addition, various 'schools' seem to exist which frequently arrive at different conclusions on the basis of the same data." (van der Eijk and Weber 1987:271). Much of the empirical controversy around these issues arises because of three distinct problems: the coexistence of cycles of different periodicities, the possibility of transient cycles and the existence of cycles without fixed periodicity. In some cases, there are no reasons to expect any of these phenomena to be relevant. Seasonality caused by Christmas is one such example (Wen 2002). In such cases, researchers mostly rely on spectral analysis and Auto-Regressive Moving-Average (ARMA) models to estimate the periodicity of cycles. However, and this is particularly true in the social sciences, sometimes there are good theoretical reasons to expect irregular cycles. In such cases, "the identification of periodic movement in something like the vote is a daunting task all by itself. When a pendulum swings with an irregular beat (frequency), and the extent of the swing (amplitude) is not constant, mathematical functions like sine-waves are of no use." (Lebo and Norpoth 2007:73)

In the past, this difficulty has led to two different approaches. On the one hand, some researchers dismissed these methods altogether, relying on informal alternatives that do not meet rigorous standards of statistical inference. Goldstein (1985 and 1988), studying the severity of Great Power wars, is one such example. On the other hand, there are authors who transfer the assumptions of spectral analysis (and ARMA models) into fundamental assumptions about the nature of social phenomena. This type of argument was produced by Beck (1991) who, in a reply to Goldstein (1988), claimed that only "fixed period models are meaningful models of cyclic phenomena". We argue that wavelet analysis, a mathematical framework developed in the mid-1980s (Grossman and Morlet 1984; Goupillaud et al. 1984), is a very viable alternative for studying cycles in political time-series. It has the advantage of staying close to the frequency-domain approach of spectral analysis while addressing its main limitations. Its principal contribution comes from estimating the spectral characteristics of a time-series as a function of time, thus revealing how its different periodic components may change over time.

The rest of the article proceeds as follows. In the section "Time-frequency Analysis", we study in some detail the continuous wavelet transform and compare its time-frequency properties with the more standard tool for that purpose, the windowed Fourier transform. In the section "The British Political Pendulum", we apply wavelet analysis to essentially the same data analyzed by Lebo and Norpoth (2007) and Merrill, Grofman and Brunell (2011) and try to provide a more nuanced answer to the same question discussed by these authors: do British electoral politics exhibit cycles? Finally, in the last section, we present a concise list of future directions.
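As a hedged illustration of the approach, the sketch below applies a continuous wavelet transform to a synthetic yearly series and inspects how wavelet power is distributed over periods. It uses the PyWavelets package and a complex Morlet wavelet, which are choices made here for the example and not necessarily those of the authors.

```python
# Minimal sketch (not the authors' implementation): continuous wavelet transform of
# a yearly political time series, showing how the power of different periodic
# components can be inspected as a function of time. The series here is synthetic.
import numpy as np
import pywt

dt = 1.0                                        # yearly observations
t = np.arange(0, 100, dt)
# synthetic series: a slowly drifting ~12-year cycle plus noise
series = np.sin(2 * np.pi * t / 12.0 + 0.3 * np.sin(0.05 * t)) + 0.5 * np.random.randn(t.size)

scales = np.arange(1, 64)
coeffs, freqs = pywt.cwt(series, scales, "cmor1.5-1.0", sampling_period=dt)
power = np.abs(coeffs) ** 2                     # wavelet power, shape (scales, time)
periods = 1.0 / freqs                           # periods in years

dominant = periods[power.mean(axis=1).argmax()]
print(f"Dominant period (time-averaged power): ≈ {dominant:.1f} years")
```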

Relevance:

100.00%

Publisher:

Abstract:

Hospitals are nowadays collecting vast amounts of data related to patient records. All these data hold valuable knowledge that can be used to improve hospital decision making. Data mining techniques aim precisely at the extraction of useful knowledge from raw data. This work describes the implementation of a medical data mining project based on the CRISP-DM methodology. Recent real-world data, from 2000 to 2013, were collected from a Portuguese hospital and related to inpatient hospitalization. The goal was to predict the generic hospital Length Of Stay based on indicators that are commonly available at the hospitalization process (e.g., gender, age, episode type, medical specialty). At the data preparation stage, the data were cleaned and variables were selected and transformed, leading to 14 inputs. Next, at the modeling stage, a regression approach was adopted in which six learning methods were compared: Average Prediction, Multiple Regression, Decision Tree, Artificial Neural Network ensemble, Support Vector Machine and Random Forest. The best learning model was obtained by the Random Forest method, which presents a high-quality coefficient of determination (R² = 0.81). This model was then opened up by using a sensitivity analysis procedure that revealed three influential input attributes: the hospital episode type, the physical service where the patient is hospitalized and the associated medical specialty. Such extracted knowledge confirmed that the obtained predictive model is credible and has potential value for supporting the decisions of hospital managers.
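A minimal sketch of this kind of modeling and sensitivity step is given below, using scikit-learn rather than the authors' actual tools: two of the compared learners are fitted to hypothetical admission-time attributes, and permutation importance stands in for the sensitivity analysis used in the paper to identify influential inputs. File and column names are assumptions.

```python
# Minimal sketch (not the authors' exact setup): compare regression models for
# length-of-stay prediction and inspect influential inputs with permutation
# importance as a simple stand-in for the paper's sensitivity analysis.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.inspection import permutation_importance

df = pd.read_csv("inpatient_episodes.csv")          # hypothetical admission-time data
X = pd.get_dummies(df[["gender", "age", "episode_type", "medical_specialty"]])
y = df["length_of_stay"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

for model in (LinearRegression(), RandomForestRegressor(n_estimators=300, random_state=0)):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, "R2:", round(model.score(X_te, y_te), 2))

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
imp = permutation_importance(rf, X_te, y_te, n_repeats=5, random_state=0)
for name, score in sorted(zip(X.columns, imp.importances_mean), key=lambda p: -p[1])[:5]:
    print(name, round(score, 3))
```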

Relevance:

100.00%

Publisher:

Abstract:

The acoustic emission (AE) technique is used for investigating the interfacial fracture and damage propagation in GFRP- and SRG-strengthened bricks during debonding tests. The bond behavior is investigated through single-lap shear bond tests, and the fracture progress during the tests is recorded by means of AE sensors. The fracture progress and the active debonding mechanisms are characterized in both specimen types with the aid of the AE outputs. Moreover, a clear distinction between the AE outputs of specimens with different failure modes, in both SRG- and GFRP-strengthened specimens, is found, which allows characterizing the debonding failure mode based on acoustic emission data.

Relevance:

100.00%

Publisher:

Abstract:

Lipid nanoballoons integrating multiple emulsions of the water-in-oil-in-water type enclose, at least in theory, a biomimetic aqueous core suitable for housing hydrophilic biomolecules such as proteins, peptides and bacteriophage particles. The research effort reported in this paper describes a full statistical 2³×3¹ factorial design study (three variables at two levels and one variable at three levels) to optimize biomimetic aqueous-core lipid nanoballoons for housing hydrophilic protein entities. The concentrations of protein, lipophilic and hydrophilic emulsifiers, and the homogenization speed were set as the four independent variables, whereas the mean particle hydrodynamic size (HS), zeta potential (ZP) and polydispersity index (PI) were set as the dependent variables. The 2³×3¹ factorial design constructed led to optimization of the higher (+1) and lower (-1) levels, with triplicate testing at the central (0) level, thus producing thirty-three experiments and leading to the selection of the optimized processing parameters as 0.015% (w/w) protein entity, 0.75% (w/w) lipophilic emulsifier (soybean lecithin) and 0.50% (w/w) hydrophilic emulsifier (poloxamer 188). In the present research effort, statistical optimization and production of protein derivatives encompassing full stabilization of their three-dimensional structure has been attempted by housing said molecular entities within biomimetic aqueous-core lipid nanoballoons integrating a multiple (W/O/W) emulsion.
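For illustration, the sketch below enumerates the coded runs of a mixed-level 2³×3¹ full factorial design with the four factors named in the abstract. Which factor is varied at three levels is an assumption made here, and the replicate centre-point runs mentioned in the abstract would be added on top of these combinations.

```python
# Minimal sketch (illustration only): enumerate the runs of a mixed-level 2^3 x 3^1
# full factorial design in coded units. The four factors follow the abstract; which
# factor carries three levels is an assumption, and replicate centre-point runs are
# not included here.
from itertools import product

two_level = {"protein_conc": (-1, +1),
             "lipophilic_emulsifier": (-1, +1),
             "hydrophilic_emulsifier": (-1, +1)}
three_level = {"homogenization_speed": (-1, 0, +1)}     # assumed three-level factor

factors = {**two_level, **three_level}
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

print(f"{len(runs)} factorial combinations")            # 2*2*2*3 = 24
for run in runs[:3]:
    print(run)
```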

Relevance:

100.00%

Publisher:

Abstract:

The equivalent annulus width concept is used to characterize a small commercial thermogravitational thermal diffusion column, and its validity is checked experimentally by batchwise separation, in the column, of n-heptane/benzene mixtures with different initial concentrations. The equation of Ruppell and Coull was used to analyse the data in the short-separation-time range and to determine the equivalent annulus width. Good agreement was obtained between the experimental and predicted time-separation curves when using the equivalent annulus width value and an averaged value of the thermal diffusion constant. A new method is presented for the simultaneous determination of the equivalent annulus width and the thermal diffusion constant of a binary mixture from a single set of experimental data.
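The structure of such a simultaneous determination can be illustrated with a nonlinear least-squares fit of two parameters to a single separation-versus-time curve, as in the sketch below. The functional form used is a generic saturating-growth placeholder, not the equation of Ruppell and Coull, and the data points are invented; only the fitting structure mirrors the method described.

```python
# Minimal sketch (structure only): estimate two column parameters jointly from a single
# separation-versus-time data set with nonlinear least squares. The model below is a
# generic saturating-growth placeholder, NOT the Ruppell and Coull equation; in practice
# it would be replaced by that expression, in which the equivalent annulus width and the
# thermal diffusion constant appear explicitly.
import numpy as np
from scipy.optimize import curve_fit

def separation(t, steady_separation, relaxation_time):
    """Placeholder model: separation approaching a steady value."""
    return steady_separation * (1.0 - np.exp(-t / relaxation_time))

# invented batchwise measurements: time (h) vs composition separation
t_data = np.array([0.5, 1, 2, 4, 8, 16, 24])
s_data = np.array([0.010, 0.019, 0.034, 0.055, 0.074, 0.083, 0.085])

(p_steady, p_tau), _ = curve_fit(separation, t_data, s_data, p0=(0.1, 5.0))
print(f"steady separation ≈ {p_steady:.3f}, relaxation time ≈ {p_tau:.1f} h")
```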