31 results for LABORATORY MODELS
at Universidade do Minho
Abstract:
Rockburst is characterized by the violent explosion of a block of rock, causing a sudden rupture, and is quite common in deep tunnels. It is critical to understand the phenomenon of rockburst, focusing on its patterns of occurrence, so that these events can be avoided and/or managed, saving costs and possibly lives. The failure mechanism of rockburst needs to be better understood. Laboratory experiments are underway at the State Key Laboratory for Geomechanics and Deep Underground Engineering (SKLGDUE) in Beijing, and the test system is described. A large number of rockburst tests were performed and their results collected, stored in a database, and analyzed. Data Mining (DM) techniques were applied to the database in order to develop predictive models for the rockburst maximum stress (σRB) and the rockburst risk index (IRB), parameters that otherwise require the results of such tests to be determined. With the developed models it is possible to predict these parameters with high accuracy using data from the rock mass and the specific project.
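As a toy illustration of the kind of predictive model this abstract describes, the sketch below fits a least-squares regression of a rockburst stress target on invented rock-mass features. The feature names, units, ranges, and coefficients are all assumptions for illustration, not the paper's database or model.

```python
# Hypothetical sketch: predicting a rockburst maximum stress target from
# rock-mass features via ordinary least squares. All data is synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 200
ucs = rng.uniform(50.0, 250.0, n)      # invented: uniaxial compressive strength (MPa)
depth = rng.uniform(200.0, 2000.0, n)  # invented: overburden depth (m)
sigma_rb = 0.4 * ucs + 0.02 * depth + rng.normal(0.0, 5.0, n)  # toy target (MPa)

X = np.column_stack([np.ones(n), ucs, depth])   # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, sigma_rb, rcond=None)
pred = X @ beta
r2 = 1.0 - np.sum((sigma_rb - pred) ** 2) / np.sum((sigma_rb - sigma_rb.mean()) ** 2)
```

The paper's actual DM models are more sophisticated than this linear sketch; the point is only the workflow of fitting a predictor on project and rock-mass data.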
Abstract:
"A workshop within the 19th International Conference on Applications and Theory of Petri Nets - ICATPN’1998"
Abstract:
The need to reduce human error has been growing in every field of study, and medicine is no exception. Through the implementation of technologies it is possible to support clinicians in the decision-making process and thereby reduce the difficulties they typically face. This study focuses on easing some of those difficulties by presenting real-time data mining models capable of predicting whether a monitored patient, typically admitted to intensive care, will need vasopressors. Data Mining models were induced using clinical variables such as vital signs and laboratory analyses, among others. The best model presented a sensitivity of 94.94%. With this model it is possible to reduce the misuse of vasopressors, acting as prevention, while offering better care to patients by anticipating their treatment with vasopressors.
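The sensitivity figure reported above is the true positive rate, TP / (TP + FN); a minimal sketch of that computation (the counts below are invented, not the study's figures):

```python
# Sensitivity (recall on positives): the fraction of patients who actually
# needed vasopressors that the model correctly flagged. Counts are illustrative.
def sensitivity(tp: int, fn: int) -> float:
    """TP / (TP + FN): share of actual positives the model catches."""
    return tp / (tp + fn)

# e.g. 94 true positives and 6 missed cases
rate = sensitivity(94, 6)
```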
Abstract:
This paper aims at developing a collision prediction model for three-leg junctions located on national roads (NR) in Northern Portugal. The focus is to identify factors that contribute to collision-type crashes at those locations, mainly factors related to road geometric consistency, since the literature on these is scarce, and to research the impact of three modeling methods on the factors of those models: generalized estimating equations, random-effects negative binomial models, and random-parameters negative binomial models. The database used included data published between 2008 and 2010 for 177 three-leg junctions. It was split into three groups of contributing factors which were tested sequentially for each of the adopted models: first, traffic only; then, traffic and the geometric characteristics of the junctions within their area of influence; and, lastly, factors capturing the difference between the geometric characteristics of the segments bordering the junctions' area of influence and the segment included in that area. The choice of the best modeling technique was supported by a cross-validation carried out to ascertain the best model for the three sets of researched contributing factors. The models fitted with random-parameters negative binomial models had the best performance in this process. In the best models obtained for every modeling technique, the characteristics of the road environment, including proxy measures for geometric consistency, along with traffic volume, contribute significantly to the number of collisions. Both the variables concerning the junctions and the national highway segments in their area of influence, as well as variations of those characteristics in the roadway segments bordering that area of influence, have proven their relevance; there is therefore a rightful need to incorporate the effect of geometric consistency in three-leg junction safety studies.
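For reference, the negative binomial (NB2) specification underlying two of the compared model families has a closed-form log-likelihood; the sketch below evaluates it for given means and dispersion. This is the generic textbook formulation, not the paper's fitted model.

```python
# NB2 log-likelihood for crash counts y with means mu_i (typically
# mu_i = exp(x_i' beta)) and overdispersion alpha; r = 1/alpha.
import math

def nb2_loglik(y, mu, alpha):
    """Sum of log NB2 probabilities: Gamma(y+r)/(Gamma(r) y!) * (r/(r+mu))^r * (mu/(r+mu))^y."""
    r = 1.0 / alpha
    ll = 0.0
    for yi, mi in zip(y, mu):
        ll += (math.lgamma(yi + r) - math.lgamma(r) - math.lgamma(yi + 1)
               + r * math.log(r / (r + mi)) + yi * math.log(mi / (r + mi)))
    return ll
```

In the random-parameters variant the paper favors, the coefficients inside mu_i are themselves allowed to vary across junctions.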
Abstract:
This article presents an experimental and numerical study of the mechanical characterization, under uniaxial compressive loading, of the adobe masonry of one of the most emblematic archaeological complexes in Peru, 'Huaca de la Luna' (100-650 AD). Compression tests of prisms were carried out with original material brought to the laboratory. For measuring local deformations in the tests, displacement transducers were used, complemented by a digital image correlation system which allowed a better understanding of the failure mechanism. The tests were then numerically simulated by modelling the masonry as a continuum. Several approaches were considered for the geometrical modelling, namely 2D and 3D simplified models, and 3D refined models based on a photogrammetric reconstruction. The results showed a good approximation between the numerical prediction and the experimental response in all cases. However, the 3D models with irregular geometries seem to reproduce better the cracking pattern observed in the tests.
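A minimal illustration of the kind of quantity extracted from such compression tests: the secant modulus between two measured points of a stress-strain curve. The values below are invented, not the Huaca de la Luna results.

```python
# Secant modulus from two points of a compression test's stress-strain curve,
# the kind of quantity derived from transducer and DIC measurements.
def secant_modulus(stress_a, strain_a, stress_b, strain_b):
    """Slope of the stress-strain curve between two measured points (same units as stress)."""
    return (stress_b - stress_a) / (strain_b - strain_a)

# e.g. stress rising from 0.2 to 0.6 MPa between strains 0.001 and 0.003
E_sec = secant_modulus(0.2, 0.001, 0.6, 0.003)  # about 200 MPa
```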
Abstract:
Occupational risks in nanotechnology research laboratories are an important topic, since a great number of researchers are involved in this area. Risk assessment, performed by both qualitative and quantitative methods, is a necessary step in the management of occupational risks. It can be performed by qualitative methods that gather consensus in the scientific community. It is also possible to use quantitative methods, based on different techniques and metrics, as indicative exposure limits are being established by several institutions. While performing the risk assessment, information on the materials used is very important and, if it is not up to date, it can bias the assessment results. The risk of exposure to TiO2 nanoparticles was assessed in a research laboratory using a quantitative exposure method and qualitative risk assessment methods. The results from the direct-reading Condensation Particle Counter (CPC) equipment and from the CB Nanotool seem to be related and aligned, while the results obtained with the Stoffenmanager Nano seem to indicate a higher risk level.
Abstract:
Developing and implementing data-oriented workflows for data migration processes are complex tasks involving several problems related to the integration of data coming from different schemas. They usually involve very specific requirements - almost every process is unique. Having a way to abstract their representation helps us to better understand and validate them with business users, which is a crucial step in requirements validation. In this demo we present an approach that incrementally enriches conceptual models in order to support the automatic production of their corresponding physical implementation. We will show how the B2K (Business to Kettle) system works, transforming BPMN 2.0 conceptual models into Kettle data-integration executable processes, covering the most relevant aspects of model design and enrichment, model-to-system transformation, and system execution.
Abstract:
ETL conceptual modeling is a very important activity in any data warehousing system project. Owning a high-level system representation allowing for a clear identification of the main parts of a data warehousing system is clearly a great advantage, especially in the early stages of design and development. However, the effort of modeling an ETL system conceptually is rarely properly rewarded. Translating ETL conceptual models directly into something that saves work and time in the concrete implementation of the system would, in fact, be a great help. In this paper we present and discuss a hybrid approach to this problem, combining the simplicity of interpretation and power of expression of BPMN for ETL systems conceptualization with the use of ETL patterns to automatically produce an ETL skeleton - a first prototype system - which can be executed in a commercial ETL tool like Kettle.
Abstract:
This work reports the implementation and verification of a new solver in the OpenFOAM® open-source computational library, able to cope with integral viscoelastic models based on the integral upper-convected Maxwell model. The code is verified through the comparison of its predictions with analytical solutions and with numerical results obtained with the differential upper-convected Maxwell model.
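For orientation, the two constitutive models mentioned can be written as follows (a standard textbook form, not necessarily the paper's exact notation): the differential upper-convected Maxwell (UCM) model, and its equivalent integral (Lodge) form with memory kernel and Finger strain tensor B(t,t'):

```latex
% Differential UCM: stress tensor tau, relaxation time lambda,
% viscosity eta, rate-of-deformation tensor D, upper-convected derivative:
\boldsymbol{\tau} + \lambda\,\overset{\nabla}{\boldsymbol{\tau}} = 2\eta\,\mathbf{D}

% Integral counterpart (Lodge form):
\boldsymbol{\tau}(t) = \int_{-\infty}^{t}
  \frac{\eta}{\lambda^{2}}\, e^{-(t-t')/\lambda}
  \left[\mathbf{B}(t,t') - \mathbf{I}\right] dt'
```

The verification strategy in the abstract exploits exactly this equivalence: for flows where both forms are tractable, the integral solver's predictions should match the differential UCM results.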
Abstract:
This review deals with the recent developments and present status of theoretical models for simulating the performance of lithium-ion batteries. Preceded by a description of the main materials used for each of the components of a battery (anode, cathode, and separator) and of how material characteristics affect battery performance, a description of the main theoretical models describing the operation and performance of a battery is presented. The influence of the most relevant parameters of the models, such as boundary conditions, geometry, and material characteristics, is discussed. Finally, suggestions for future work are proposed.
Abstract:
Extreme value models are widely used in different areas. The Birnbaum–Saunders distribution has been receiving considerable attention due to its physical arguments and its good properties. We propose a methodology based on extreme value Birnbaum–Saunders regression models, which includes model formulation, estimation, inference, and checking. We further conduct a simulation study to evaluate its performance. A statistical analysis with real-world extreme value environmental data using the methodology is provided as an illustration.
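As a sketch of the distribution underlying these regression models, Birnbaum–Saunders variates can be generated from a standard normal Z through the representation T = β(αZ/2 + √((αZ/2)² + 1))². The parameter values below are arbitrary, chosen only for illustration.

```python
# Generating Birnbaum-Saunders variates via the standard-normal representation.
import math
import random

def rbirnbaum_saunders(alpha: float, beta: float, rng: random.Random) -> float:
    """Draw one BS(alpha, beta) variate: shape alpha > 0, scale beta > 0."""
    z = rng.gauss(0.0, 1.0)
    w = alpha * z / 2.0
    return beta * (w + math.sqrt(w * w + 1.0)) ** 2

rng = random.Random(42)
sample = [rbirnbaum_saunders(0.5, 2.0, rng) for _ in range(10_000)]
# theoretical mean is beta * (1 + alpha**2 / 2) = 2.25 for these parameters
mean = sum(sample) / len(sample)
```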
Abstract:
Depression is an extremely heterogeneous disorder, and diverse molecular mechanisms have been suggested to underlie its etiology. To understand the molecular mechanisms responsible for this complex disorder, researchers have used animal models extensively, namely mice from various genetic backgrounds and harboring distinct genetic modifications. The use of numerous mouse models has enriched our knowledge of depression. However, accumulating data have also revealed that the intrinsic characteristics of each mouse strain might influence the experimental outcomes, which may explain some conflicting evidence reported in the literature. To further understand the impact of the genetic background, we performed a multimodal comparative study encompassing the most relevant parameters commonly addressed in depression, in three of the most widely used mouse strains: Balb/c, C57BL/6, and CD-1. Moreover, female mice were selected for this study, taking into account the higher prevalence of depression in women and the smaller number of animal studies using females. Our results show that Balb/c mice have a more pronounced anxious-like behavior than CD-1 and C57BL/6 mice, whereas C57BL/6 animals present the strongest depressive-like trait. Furthermore, C57BL/6 mice display the highest rate of proliferating cells and the highest brain-derived neurotrophic factor (Bdnf) expression levels in the hippocampus, while hippocampal dentate granule neurons of Balb/c mice show smaller dendritic lengths and fewer ramifications. Of note, the expression levels of inducible nitric oxide synthase (iNos) predict 39.5% of the depressive-like behavior index, which suggests a key role of hippocampal iNOS in depression. Overall, this study reveals important interstrain differences in several behavioral dimensions and in molecular and cellular parameters that should be considered when designing and analyzing experiments addressing depression using mouse models. It further contributes to the literature by revealing the predictive value of hippocampal iNos expression levels for depressive-like behavior, irrespective of the mouse strain.
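The "predicts 39.5%" figure corresponds to a coefficient of determination (R² ≈ 0.395) from regressing the behavior index on iNos expression; a minimal sketch of that generic computation (not the study's data):

```python
# R^2: the share of variance in observed values y explained by predictions y_hat.
def r_squared(y, y_hat):
    """1 - SS_res / SS_tot for observed y and model predictions y_hat."""
    mean_y = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, y_hat))
    ss_tot = sum((a - mean_y) ** 2 for a in y)
    return 1.0 - ss_res / ss_tot
```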
Abstract:
We survey results about exact cylindrically symmetric models of gravitational collapse in General Relativity. We focus on models which result from the matching of two spacetimes having collapsing interiors which develop trapped surfaces and vacuum exteriors containing gravitational waves. We collect some theorems from the literature which help to decide a priori whether particular spacetime matchings are possible. We review, in more detail, some toy models which include some of the main mathematical and physical issues that arise in this context, and compute the gravitational energy flux through the matching boundary of a particular collapsing region. Along the way, we point out several interesting open problems.
Abstract:
The chemical composition of propolis is affected by environmental factors and harvest season, making it difficult to standardize its extracts for medicinal use. By detecting a typical chemical profile associated with propolis from a specific production region or season, certain types of propolis may be selected to obtain a specific pharmacological activity. In this study, propolis from three agroecological regions (plain, plateau, and highlands) of southern Brazil, collected over the four seasons of 2010, was investigated through a novel NMR-based metabolomics data analysis workflow. Chemometrics and machine learning algorithms (PLS-DA and RF), including methods to estimate variable importance in classification, were used in this study. The machine learning and feature selection methods permitted the construction of models for propolis sample classification with high accuracy (>75%, reaching 90% in the best case), discriminating samples better by their collection season than by their harvest region. PLS-DA and RF allowed the identification of biomarkers for sample discrimination, expanding the set of discriminating features and adding relevant information for the identification of the class-determining metabolites. The NMR-based metabolomics analytical platform, coupled to bioinformatic tools, allowed the characterization and classification of Brazilian propolis samples regarding the metabolite signature of important compounds, i.e., their chemical fingerprint, harvest season, and production region.
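As a toy illustration of the classification step, the sketch below applies a nearest-centroid rule to synthetic "spectral bin" features as a simple stand-in for the PLS-DA and random forest models of the paper. All data, class labels, and dimensions are invented.

```python
# Nearest-centroid classification of synthetic samples into two "seasons":
# train on a subset, classify held-out samples by closest class mean.
import numpy as np

rng = np.random.default_rng(1)
n, bins = 60, 20
season_a = rng.normal(0.0, 1.0, (n, bins))   # invented seasonal profile
season_b = rng.normal(1.5, 1.0, (n, bins))   # shifted profile for a second season
X = np.vstack([season_a, season_b])
y = np.array([0] * n + [1] * n)

test = np.arange(len(y)) % 3 == 0            # hold out every 3rd sample
centroids = np.stack([X[~test][y[~test] == k].mean(axis=0) for k in (0, 1)])
d = np.linalg.norm(X[test][:, None, :] - centroids[None, :, :], axis=2)
acc = np.mean(d.argmin(axis=1) == y[test])
```

With well-separated synthetic classes this rule easily clears the >75% accuracy the abstract reports; the real analysis, of course, relies on PLS-DA/RF over actual NMR spectra.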
Abstract:
In this article, we develop a specification technique for building multiplicative time-varying GARCH models of Amado and Teräsvirta (2008, 2013). The variance is decomposed into an unconditional and a conditional component such that the unconditional variance component is allowed to evolve smoothly over time. This nonstationary component is defined as a linear combination of logistic transition functions with time as the transition variable. The appropriate number of transition functions is determined by a sequence of specification tests. For that purpose, a coherent modelling strategy based on statistical inference is presented. It is heavily dependent on Lagrange multiplier type misspecification tests. The tests are easily implemented as they are entirely based on auxiliary regressions. Finite-sample properties of the strategy and tests are examined by simulation. The modelling strategy is illustrated in practice with two real examples: an empirical application to daily exchange rate returns and another one to daily coffee futures returns.
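A minimal sketch of the unconditional-variance component described above: a constant plus a linear combination of logistic transition functions of rescaled time t/T. The functional form below is a simplified single-transition version, and the parameter values are illustrative, not estimates from the article.

```python
# Smoothly time-varying unconditional-variance component:
# g(s) = 1 + sum_l delta_l * G(s; gamma_l, c_l), with logistic G and s = t/T.
import math

def logistic_transition(s: float, gamma: float, c: float) -> float:
    """G(s; gamma, c) = 1 / (1 + exp(-gamma * (s - c))), slope gamma, location c."""
    return 1.0 / (1.0 + math.exp(-gamma * (s - c)))

def g_component(s: float, deltas, gammas, cs) -> float:
    """g(s) = 1 + sum of delta_l * G(s; gamma_l, c_l) over transitions l."""
    return 1.0 + sum(d * logistic_transition(s, g, c)
                     for d, g, c in zip(deltas, gammas, cs))

# one smooth upward shift in unconditional variance around mid-sample
level = g_component(0.5, [0.8], [50.0], [0.5])  # halfway through the transition
```

The conditional GARCH component then multiplies g(t/T), so the total variance is the product of the two; the specification tests in the article decide how many such transitions are needed.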