25 results for N epoch application

at Universidade do Minho


Relevance:

20.00%

Publisher:

Abstract:

This paper presents a mobile information system called the Vehicle-to-Anything Application (V2Anything App) and explains its conceptual aspects. The application aims to give relevant information to Full Electric Vehicle (FEV) drivers by integrating several data sources in a single mobile application, thus contributing to the deployment of electric mobility. The V2Anything App gives drivers recommendations about FEV range autonomy, the location of battery charging stations, and the electricity market, and also offers a route planner that takes into account public transport and car- or bike-sharing systems. The main contributions of this application concern the creation of an Information and Communication Technology (ICT) platform, recommender systems, data integration, driver profiling, and personalized range prediction. It is thus possible to deliver to FEV drivers relevant information on the electric mobility process, the electricity market, public transport, and FEV performance.
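As a purely illustrative sketch of the personalized range prediction mentioned above: the function below combines battery state with a driver-profile factor. The function name, parameters, and figures are all invented assumptions; the paper does not describe the V2Anything App implementation at this level.

```python
# Hypothetical sketch: personalized FEV range estimate combining battery state
# with a driver-profile efficiency factor. All names and numbers are invented
# for illustration, not taken from the V2Anything App.

def estimate_range_km(battery_kwh: float, state_of_charge: float,
                      base_consumption_kwh_per_km: float,
                      driver_aggressiveness: float = 1.0) -> float:
    """Remaining range = usable energy / personalized consumption."""
    usable_energy = battery_kwh * state_of_charge
    personalized_consumption = base_consumption_kwh_per_km * driver_aggressiveness
    return usable_energy / personalized_consumption

# Example: 24 kWh pack at 50% charge, 0.15 kWh/km, calm driver (factor 0.9)
print(round(estimate_range_km(24.0, 0.5, 0.15, 0.9), 1))  # → 88.9
```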

Relevance:

20.00%

Publisher:

Abstract:

"The idea that social processes develop in a cyclical manner is somewhat like a 'Lorelei'. Researchers are lured to it because of its theoretical promise, only to become entangled in (if not wrecked by) messy problems of empirical inference. The reasoning leading to hypotheses of some kind of cycle is often elegant enough, yet the data from repeated observations rarely display the supposed cyclical pattern. (...) In addition, various 'schools' seem to exist which frequently arrive at different conclusions on the basis of the same data." (van der Eijk and Weber 1987:271). Much of the empirical controversy around these issues arises from three distinct problems: the coexistence of cycles of different periodicities, the possibility of transient cycles, and the existence of cycles without fixed periodicity. In some cases, there is no reason to expect any of these phenomena to be relevant; seasonality caused by Christmas is one such example (Wen 2002). In such cases, researchers mostly rely on spectral analysis and Auto-Regressive Moving-Average (ARMA) models to estimate the periodicity of cycles. However, and this is particularly true in the social sciences, sometimes there are good theoretical reasons to expect irregular cycles. In such cases, "the identification of periodic movement in something like the vote is a daunting task all by itself. When a pendulum swings with an irregular beat (frequency), and the extent of the swing (amplitude) is not constant, mathematical functions like sine-waves are of no use." (Lebo and Norpoth 2007:73) In the past, this difficulty has led to two different approaches. On the one hand, some researchers dismissed these methods altogether, relying on informal alternatives that do not meet rigorous standards of statistical inference; Goldstein (1985, 1988), who studied the severity of great-power wars, is one such example.
On the other hand, there are authors who transfer the assumptions of spectral analysis (and ARMA models) into fundamental assumptions about the nature of social phenomena. This type of argument was produced by Beck (1991) who, in a reply to Goldstein (1988), claimed that only "fixed period models are meaningful models of cyclic phenomena". We argue that wavelet analysis, a mathematical framework developed in the mid-1980s (Grossman and Morlet 1984; Goupillaud et al. 1984), is a very viable alternative for studying cycles in political time-series. It has the advantage of staying close to the frequency-domain approach of spectral analysis while addressing its main limitations. Its principal contribution comes from estimating the spectral characteristics of a time-series as a function of time, thus revealing how its different periodic components may change over time. The rest of the article proceeds as follows. In the section "Time-frequency Analysis", we study in some detail the continuous wavelet transform and compare its time-frequency properties with the more standard tool for that purpose, the windowed Fourier transform. In the section "The British Political Pendulum", we apply wavelet analysis to essentially the same data analyzed by Lebo and Norpoth (2007) and Merrill, Grofman and Brunell (2011) and try to provide a more nuanced answer to the same question discussed by these authors: do British electoral politics exhibit cycles? Finally, in the last section, we present a concise list of future directions.
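Since the abstract centers on the continuous wavelet transform, a minimal illustrative sketch may help. This is not the authors' code: the Morlet parameter omega0 = 6 and the scale grid are common defaults assumed here, and the test signal is invented.

```python
import numpy as np

# Sketch of a continuous wavelet transform with a Morlet wavelet, showing how
# spectral content is estimated as a function of time. omega0 = 6 and the
# geometric scale grid below are conventional choices, not the paper's.

def morlet_cwt(signal, scales, dt=1.0, omega0=6.0):
    """Return |CWT| with shape (len(scales), len(signal))."""
    n = len(signal)
    omega = 2 * np.pi * np.fft.fftfreq(n, dt)   # angular frequency grid
    sig_hat = np.fft.fft(signal)
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # Fourier transform of the analytic Morlet wavelet at scale s
        psi_hat = (np.pi ** -0.25) * np.exp(-0.5 * (s * omega - omega0) ** 2)
        psi_hat *= (omega > 0)                  # keep positive frequencies only
        out[i] = np.fft.ifft(sig_hat * psi_hat) * np.sqrt(s)
    return np.abs(out)

# A signal whose period changes half-way through: 2 Hz, then 8 Hz.
t = np.arange(1024) * 0.01
x = np.concatenate([np.sin(2 * np.pi * 2 * t[:512]),
                    np.sin(2 * np.pi * 8 * t[512:])])
scales = np.geomspace(0.05, 1.0, 40)            # scales in seconds
power = morlet_cwt(x, scales, dt=0.01)
print(power.shape)  # (40, 1024)
```

Each row of `power` is the response at one scale; in a time-frequency plot the ridge of maximum power jumps from a large scale (low frequency) to a small scale (high frequency) at the signal's mid-point, which is exactly the kind of time-varying periodicity a global spectrum cannot show.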

Relevance:

20.00%

Publisher:

Abstract:

Sustainability is frequently defined by its three pillars: economically viable, socially equitable, and environmentally bearable. Consequently, evaluating the sustainability of any decision, public or private, requires information on these three dimensions. This paper focuses on social sustainability. In the context of renewable energy sources, examining social sustainability requires analyzing not only the efficiency but also the equity of its welfare impacts. The present paper proposes and applies a methodology to generate the information necessary for a proper welfare analysis of the social sustainability of renewable energy production facilities. This information is key for both an equity and an efficiency analysis. The analysis focuses on investments in renewable energy electricity production facilities, where the impacts on local residents' welfare often differ significantly from the welfare effects on the general population. We apply the contingent valuation method to selected facilities across the different renewable energy power plants located in Portugal and conclude that local residents perceive differently the damage caused by the type, location, and operation of the plants. The results from these case studies attest to the need to acknowledge and quantify the negative impacts on local communities when assessing the economic viability, social equity, and environmental impact of renewable energy projects.

Relevance:

20.00%

Publisher:

Abstract:

Hospitals nowadays collect vast amounts of data related to patient records. These data hold valuable knowledge that can be used to improve hospital decision making. Data mining techniques aim precisely at extracting useful knowledge from raw data. This work describes the implementation of a medical data mining project based on the CRISP-DM methodology. Recent real-world data, from 2000 to 2013, related to inpatient hospitalization were collected from a Portuguese hospital. The goal was to predict generic hospital Length Of Stay from indicators that are commonly available at the hospitalization process (e.g., gender, age, episode type, medical specialty). At the data preparation stage, the data were cleaned and variables were selected and transformed, leading to 14 inputs. Next, at the modeling stage, a regression approach was adopted in which six learning methods were compared: Average Prediction, Multiple Regression, Decision Tree, Artificial Neural Network ensemble, Support Vector Machine and Random Forest. The best learning model was obtained with the Random Forest method, which presents a high coefficient of determination (0.81). This model was then opened using a sensitivity analysis procedure that revealed three influential input attributes: the hospital episode type, the physical service where the patient is hospitalized, and the associated medical specialty. Such extracted knowledge confirms that the obtained predictive model is credible and potentially valuable for supporting the decisions of hospital managers.
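To make the comparison metric concrete, here is a small sketch (not the paper's code) of the Average Prediction baseline and the coefficient of determination used to rank the six methods; the length-of-stay values are invented toy numbers.

```python
# Illustrative sketch: the Average Prediction baseline and the coefficient of
# determination (R²) used to compare regression models. Toy data only.

def r_squared(actual, predicted):
    """R² = 1 - SS_res / SS_tot; 1.0 is a perfect fit, 0.0 matches the mean."""
    mean_y = sum(actual) / len(actual)
    ss_tot = sum((y - mean_y) ** 2 for y in actual)
    ss_res = sum((y - p) ** 2 for y, p in zip(actual, predicted))
    return 1 - ss_res / ss_tot

# Invented length-of-stay values (days) for illustration.
los = [3, 5, 2, 8, 4, 6]
average_prediction = [sum(los) / len(los)] * len(los)   # always predicts the mean
print(r_squared(los, average_prediction))  # 0.0 by construction
```

Any useful model must beat this R² = 0 baseline; on this scale the paper's Random Forest reaches 0.81.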

Relevance:

20.00%

Publisher:

Abstract:

The acoustic emission (AE) technique is used for investigating the interfacial fracture and damage propagation in GFRP- and SRG-strengthened bricks during debonding tests. The bond behavior is investigated through single-lap shear bond tests, and the fracture progress during the tests is recorded by means of AE sensors. The fracture progress and the active debonding mechanisms are characterized in both specimen types with the aid of the AE outputs. Moreover, a clear distinction is found between the AE outputs of specimens with different failure modes, in both SRG- and GFRP-strengthened specimens, which allows characterizing the debonding failure mode based on acoustic emission data.

Relevance:

20.00%

Publisher:

Abstract:

Timber frame buildings are well known as efficient seismic-resistant structures, popular all over the world not only for their seismic performance but also for their low cost and the strength they offer. Many of these constructions still exist today, and it is important to be able to preserve them, so a better knowledge of their behaviour is sought. Furthermore, historic technologies could be used even in modern construction to build seismic-resistant buildings from more natural materials at lower cost. A great rehabilitation effort is being carried out on this type of building, as neglect has led to decay, while changes in use and alterations to the structure have created the need to retrofit such buildings. Only recently have studies on their behaviour become available, and only a few of them address possible strengthening techniques for this kind of wall. In this scope, an innovative retrofitting technique, near-surface mounted (NSM) steel flat bars, is proposed and validated on traditional timber frame walls through an extensive experimental program. The results of the static cyclic tests on distinct wall typologies retrofitted with the NSM technique are herein presented and discussed in detail. The main features concerning deformation, lateral stiffness, lateral resistance, and seismic performance indexes are analysed.

Relevance:

20.00%

Publisher:

Abstract:

The present paper focuses on a damage identification method based on the second-order spectral properties of the nodal response processes. The explicit dependence of the outputs' power spectral densities on the frequency content makes them suitable for damage detection and localization. The well-known case study of the Z24 Bridge in Switzerland is chosen to apply and further investigate this technique, with the aim of validating its reliability. Numerical simulations of the dynamic response of the structure subjected to different types of excitation are carried out to assess the variability of the spectrum-driven method with respect to both the type and the position of the excitation sources. The simulated data, obtained from random vibrations and impulse, ramp and shaking forces, allowed building the power spectrum matrix from which the main eigenparameters of the reference and damage scenarios are extracted. Afterwards, complex eigenvectors and real eigenvalues are properly weighted and combined, and a damage index based on the difference between spectral modes is computed to pinpoint the damage. Finally, a group of vibration-based damage identification methods is selected from the literature to compare the results obtained and to evaluate the performance of the spectral index.
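The core object here, the power spectrum matrix and its eigenparameters, can be sketched as follows. This is a generic hedged illustration, not the paper's formulation: the segment-averaged cross-spectral estimator and the simulated white-noise responses are assumptions.

```python
import numpy as np

# Hedged sketch of the spectrum-driven idea: estimate the cross-spectral
# density (power spectrum) matrix of multichannel responses, then extract its
# eigenvalues/eigenvectors frequency by frequency. Estimator details are
# generic choices, not the paper's exact procedure.

def csd_matrix(responses, nseg=8):
    """responses: (channels, samples). Returns (freq_bins, ch, ch) CSD matrix."""
    ch, n = responses.shape
    seg_len = n // nseg
    segs = responses[:, :nseg * seg_len].reshape(ch, nseg, seg_len)
    spec = np.fft.rfft(segs * np.hanning(seg_len), axis=2)   # (ch, nseg, f)
    # Average the outer products over segments -> Hermitian CSD matrix per bin
    return np.einsum('isf,jsf->fij', spec, np.conj(spec)) / nseg

rng = np.random.default_rng(0)
resp = rng.standard_normal((3, 4096))          # simulated 3-channel response
S = csd_matrix(resp)
eigvals, eigvecs = np.linalg.eigh(S)           # per-frequency spectral modes
print(S.shape, eigvals.shape)                  # (257, 3, 3) (257, 3)
```

Per frequency bin, the real eigenvalues act as spectral amplitudes and the complex eigenvectors as spectral modes; a damage index can then compare these modes between reference and damaged states.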

Relevance:

20.00%

Publisher:

Abstract:

Rockburst is characterized by a violent explosion of a block causing a sudden rupture in the rock and is quite common in deep tunnels. It is critical to understand the phenomenon of rockburst, focusing on the patterns of occurrence, so that these events can be avoided and/or managed, saving costs and possibly lives. The failure mechanism of rockburst needs to be better understood. Laboratory experiments are under way at the State Key Laboratory for Geomechanics and Deep Underground Engineering (SKLGDUE) in Beijing, and the test system is described. A large number of rockburst tests were performed, and their information was collected, stored in a database and analyzed. Data Mining (DM) techniques were applied to the database in order to develop predictive models for the rockburst maximum stress (σRB) and the rockburst risk index (IRB), parameters that otherwise require the results of such tests to be determined. With the developed models it is possible to predict these parameters with high accuracy using data from the rock mass and the specific project.

Relevance:

20.00%

Publisher:

Abstract:

A new very high-order finite volume method for solving problems with harmonic and biharmonic operators in one-dimensional geometries is proposed. The main ingredient is a polynomial reconstruction based on local interpolations of mean values, which provides accurate approximations of the solution up to sixth-order accuracy. First developed for the harmonic operator, the method is extended to the biharmonic operator, allowing the design of a very high-order finite volume scheme where the solution is obtained by solving a matrix-free problem. An application in elasticity coupling the two operators is presented: a beam subject to a combination of tensile and bending loads, where the main goal is determining the critical stress point for an intramedullary nail.
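For readers unfamiliar with the finite volume setting, here is a deliberately simplified sketch: a basic second-order scheme for the 1-D harmonic operator acting on cell mean values. It is not the paper's sixth-order method; the test problem and ghost-cell boundary treatment are standard textbook choices.

```python
import numpy as np

# Simplified finite volume sketch: solve -u'' = f on (0,1), u(0) = u(1) = 0,
# using cell mean values on a uniform grid. Second order only; the paper's
# scheme reaches sixth order via higher-degree polynomial reconstruction of
# these same mean values.

def fv_harmonic(n):
    h = 1.0 / n
    # Three-point flux-difference stencil on cell averages
    A = (np.diag(np.full(n, 2.0))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1))
    A[0, 0] = A[-1, -1] = 3.0     # ghost-cell Dirichlet treatment at the faces
    A /= h ** 2
    centers = (np.arange(n) + 0.5) * h
    f = np.pi ** 2 * np.sin(np.pi * centers)   # chosen so exact u = sin(pi x)
    return centers, np.linalg.solve(A, f)

x, u = fv_harmonic(80)
err = np.max(np.abs(u - np.sin(np.pi * x)))
print(f"max error: {err:.1e}")
```

Halving h divides the error by about four here; a sixth-order reconstruction would divide it by about 64, which is the practical appeal of the very high-order scheme.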

Relevance:

20.00%

Publisher:

Abstract:

Due to water scarcity, it is important to organize and regulate water resources utilization to satisfy conflicting water demands and needs. This paper describes a comprehensive methodology for managing the water sector of a defined urbanized region using the robust capabilities of a Geographic Information System (GIS), which integrates hardware, software, and data for capturing, managing, analyzing, and displaying all forms of geographic information. The proposed methodology is based on finding alternatives to cover the gap between current supplies and future demands. Nablus, a main governorate in the north of the West Bank, Palestine, was selected as the case study because the area is classified as arid to semi-arid. The resulting plan for Nablus represents an example of the proposed methodology's implementation and a valid framework for the elaboration of a water master plan.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents an improved version of an application whose goal is to provide a simple and intuitive way to use multicriteria decision methods in day-to-day decision problems. The application allows comparing several alternatives against several criteria, always keeping a permanent backup of both the model and the results, and provides a framework for incorporating new methods in the future. Developed in C#, the application implements the AHP, SMART and Value Functions methods.
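The application itself is written in C# and its internals are not given here, so as a hedged illustration of one method it implements, below is the standard AHP priority-vector step (principal eigenvector of a pairwise comparison matrix) in Python; the comparison matrix is an invented example.

```python
import numpy as np

# Sketch of the AHP weighting step: derive criteria weights from a reciprocal
# pairwise comparison matrix via its principal eigenvector. The matrix below
# is illustrative, not from the paper.

def ahp_weights(pairwise):
    """Principal eigenvector of a reciprocal comparison matrix, normalized."""
    vals, vecs = np.linalg.eig(pairwise)
    principal = np.real(vecs[:, np.argmax(np.real(vals))])
    return principal / principal.sum()     # weights sum to 1, all positive

# Invented criteria: cost is judged 3x as important as quality, 5x as delivery
M = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w = ahp_weights(M)
print(np.round(w, 3))   # cost ranked highest, weights sum to 1
```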

Relevance:

20.00%

Publisher:

Abstract:

The currently available clinical imaging methods do not provide highly detailed information about the location and severity of axonal injury or the expected recovery time of patients with traumatic brain injury [1]. High-Definition Fiber Tractography (HDFT) is a novel imaging modality, based on diffusion technology [2], that allows directly visualizing and quantifying the degree of axonal damage, predicting functional deficits due to traumatic axonal injury and loss of cortical projections. The absence of a phantom able to properly mimic the human brain hinders the possibility of testing, calibrating and validating these medical imaging techniques. Most research in this area falls short on key points, such as the size limit of the reproduced brain fibers and the quick and easy reproducibility of the phantoms [3]. For that reason, it is necessary to develop similar structures matching the micron scale of axon tubes. Flexible textiles can play an important role, since they allow producing controlled packing densities and crossing structures that closely match the crossing patterns of the human brain. To build a brain phantom, several parameters must be taken into account in material selection, such as hydrophobicity, density and fiber diameter, since these factors directly influence the values of fractional anisotropy. Fiber cross-section shape is another important parameter. Earlier studies showed that synthetic fibrous materials are a good choice for building a brain phantom [4]. The present work is part of a broader project that aims to develop a brain phantom made of fibrous materials to validate and calibrate HDFT. Due to the similarity between the axons and the thousands of hollow multifilaments in a fibrous arrangement such as a yarn, low-twist polypropylene multifilament yarns were selected for this development.
In this sense, extruded hollow filaments were analysed in a scanning electron microscope to characterize their main dimensions and shape. In order to approximate the dimensional scale of human axons, five types of polypropylene yarns with different linear densities (denier) were used, aiming to understand the effect of linear density on the filament inner and outer areas. Moreover, in order to achieve the required dimensions, the polypropylene filament cross-section was reduced in the drawing stage of a filament extrusion line. Subsequently, tensile tests were performed to characterize the mechanical behaviour of the hollow filaments and to evaluate the differences between stretched and non-stretched filaments. In general, an increase in linear density increases the size of the filament cross-section. With the increase in structural orientation of the filaments induced by stretching, breaking tenacity increases and elongation at break decreases. The production of hollow fibers with the required characteristics is one of the key steps in creating a brain phantom that properly mimics the human brain and may be used for the validation and calibration of HDFT, an imaging approach that is expected to contribute significantly to brain-related research.
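Fractional anisotropy, the diffusion metric the phantom materials must reproduce, has a standard closed form in terms of the three eigenvalues of the diffusion tensor; the sketch below uses that standard formula with invented example eigenvalues.

```python
import math

# Fractional anisotropy (FA) from the three diffusion-tensor eigenvalues.
# Standard formula; the example eigenvalue sets are illustrative only.

def fractional_anisotropy(l1, l2, l3):
    mean = (l1 + l2 + l3) / 3
    num = (l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return math.sqrt(1.5 * num / den)

print(round(fractional_anisotropy(1.7, 0.3, 0.3), 3))  # 0.799: elongated, high FA
print(round(fractional_anisotropy(1.0, 1.0, 1.0), 3))  # 0.0: isotropic diffusion
```

FA ranges from 0 (fully isotropic) to 1 (diffusion along a single axis), which is why packing density and fiber geometry, both of which constrain diffusion direction, directly shift the measured values.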

Relevance:

20.00%

Publisher:

Abstract:

A novel framework for probabilistic-based structural assessment of existing structures, combining model identification and reliability assessment procedures and considering different sources of uncertainty in an objective way, is presented in this paper. A short description of structural assessment applications reported in the literature is initially given. Then, the developed model identification procedure, supported by a robust optimization algorithm, is presented. Special attention is given to both experimental and numerical errors, which are considered in this algorithm's convergence criterion. An updated numerical model is obtained from this process. The reliability assessment procedure, which considers a probabilistic model of the structure under analysis, is then introduced, incorporating the results of the model identification procedure. The developed model is then updated, as new data are acquired, through a Bayesian inference algorithm that explicitly addresses statistical uncertainty. Finally, the developed framework is validated with a set of reinforced concrete beams that were loaded up to failure in the laboratory.
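The Bayesian updating step can be illustrated with the simplest conjugate case: a normal prior on a single parameter combined with noisy measurements. This is a generic textbook simplification assumed for illustration, not the paper's full inference algorithm; all numbers are invented.

```python
# Generic sketch of Bayesian updating: normal prior on a resistance-type
# parameter, normal measurement noise with known variance. A standard
# conjugate simplification, not the paper's algorithm; numbers are invented.

def update_normal(prior_mean, prior_var, observations, noise_var):
    """Posterior mean/variance after conditioning on noisy observations."""
    n = len(observations)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mean = post_var * (prior_mean / prior_var + sum(observations) / noise_var)
    return post_mean, post_var

# Prior belief about a beam parameter, updated with three hypothetical tests
mean, var = update_normal(30.0, 4.0, [33.1, 32.4, 33.5], 1.0)
print(round(mean, 2), round(var, 3))  # 32.77 0.308
```

Each new batch of data shrinks the posterior variance (here from 4.0 to about 0.31), which is the sense in which statistical uncertainty is explicitly tracked as the model is updated.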