887 results for Full scale testing
Abstract:
Experimental wind tunnel and smoke visualisation testing and CFD modelling were conducted to investigate the effect of air flow control mechanisms and heat sources inside rooms on the performance of wind catchers/towers. For this purpose, a full-scale wind catcher was connected to a test room and positioned centrally in an open boundary wind tunnel. Pressure coefficients (Cp) around the wind catcher and the air flow into the test room were established. The performance of the wind catcher depends greatly on wind speed and direction. The incorporation of dampers and an egg-crate grille at ceiling level reduces and regulates the air flow rate, with an average pressure loss coefficient of 0.01. Operating the wind catcher in the presence of heat sources will potentially lower internal temperatures in line with external temperatures.
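As a hedged illustration of the loss coefficient quoted above, the pressure drop across a duct fitting is commonly written as ΔP = k · ½ρv²; the sketch below applies that standard relation with the reported average k = 0.01 for the dampers and egg-crate grille. The air density and face velocity are illustrative assumptions, not values from the study.

```python
# Sketch (not from the study): pressure drop across a fitting with loss
# coefficient k, using the standard relation dp = k * 0.5 * rho * v^2.
RHO_AIR = 1.2  # kg/m^3, assumed air density near 20 degC

def pressure_loss(v, k=0.01, rho=RHO_AIR):
    """Pressure drop in Pa across a fitting with loss coefficient k
    at face velocity v in m/s."""
    return k * 0.5 * rho * v * v

dp = pressure_loss(3.0)  # at an assumed 3 m/s, k = 0.01 costs ~0.054 Pa
```

A coefficient this small implies the grille and dampers regulate the flow while imposing a negligible pressure penalty, consistent with the abstract's conclusion.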
Abstract:
Wind catcher systems have been employed in buildings in the Middle East for many centuries and are known by different names in different parts of the region. Recently there has been an increase in the application of this approach for natural ventilation and passive cooling in the UK and other countries. This paper presents the results of experimental wind tunnel and smoke visualisation testing, combined with CFD modelling, to investigate the performance of a wind catcher. For this purpose, a full-scale commercial system was connected to a test room and positioned centrally in an open boundary wind tunnel. Because much ventilation design involves the use of computational fluid dynamics, the measured performance of the system was also compared against the results of CFD analysis. Configurations included both a heated and an unheated space to determine the impact of internal heat sources on airflow rate. Good agreement between measurement and CFD analysis was obtained. Measurements showed that sufficient air change could be achieved to meet both air quality and passive cooling needs.
Abstract:
In the 1990s the Message Passing Interface Forum defined MPI bindings for Fortran, C, and C++. With the success of MPI, these relatively conservative languages have continued to dominate in the parallel computing community. There are compelling arguments in favour of more modern languages like Java, including portability, better runtime error checking, modularity, and multi-threading. But these arguments have not converted many HPC programmers, perhaps due to the scarcity of full-scale scientific Java codes and the lack of evidence for performance competitive with C or Fortran. This paper tries to redress this situation by porting two scientific applications to Java. Both applications are parallelized using our thread-safe Java messaging system, MPJ Express. The first application is the Gadget-2 code, a massively parallel structure formation code for cosmological simulations. The second application uses the finite-difference time-domain (FDTD) method for simulations in the area of computational electromagnetics. We evaluate and compare the performance of the Java and C versions of these two scientific applications, and demonstrate that the Java codes can achieve performance comparable with legacy applications written in conventional HPC languages. Copyright © 2009 John Wiley & Sons, Ltd.
Abstract:
This paper presents a completely new design of a bogie frame made of glass fibre reinforced composites, with its performance under various loading conditions predicted by finite element analysis. The bogie consists of two frames, one placed on top of the other, and two axle ties connecting the axles. Each frame consists of two side arms with a transom between them. The top frame is thinner, more compliant and more highly curved than the bottom frame. Variable vertical stiffness can be achieved before and after contact between the two frames at the central section of the bogie, to cope with different load levels. Finite element analysis (FEA) played a very important role in the design of this structure: the stiffness and stress levels of the full-scale bogie under the various loading conditions presented in this paper were predicted using Marc, provided by MSC Software. To verify the FEA models, a fifth-scale prototype of the bogie was made and tested under quasi-static loading conditions. The fifth-scale test results were used to fine-tune details such as contact and friction in the fifth-scale FEA models, and these conditions were then applied to the full-scale models. The FEA results show that the stress levels in all directions are low compared with the material strengths.
Abstract:
New business and technology platforms are required to sustainably manage urban water resources [1,2]. However, any proposed solution must be cognisant of security, privacy and other factors that may inhibit adoption and hence impact. The FP7 WISDOM project (funded by the European Commission, GA 619795) aims to achieve a step change in water and energy savings via the integration of innovative Information and Communication Technology (ICT) frameworks to optimize water distribution networks and to enable change in consumer behavior through innovative demand management and adaptive pricing schemes [1,2,3]. The WISDOM concept centres on the integration of water distribution, sensor monitoring and communication systems, coupled with semantic modelling (using ontologies, potentially connected to BIM, to serve as intelligent linkages throughout the entire framework) and control capabilities, to provide near real-time management of urban water resources. Fundamental to this framework are the needs and operational requirements of users and stakeholders at domestic, corporate and city levels, which require the interoperability of a number of demand and operational models fed with data from diverse sources such as sensor networks and crowdsourced information. This has implications for the provenance and trustworthiness of such data and for how it can be used, not only in understanding system and user behaviours but, more importantly, in the real-time control of such systems. Adaptive and intelligent analytics will be used to produce decision support systems that will drive the ability to increase the variability of both supply and consumption [3]. This in turn paves the way for adaptive pricing incentives and a greater understanding of the water-energy nexus. This integration is complex and uncertain, yet typical of a cyber-physical system, and its relevance transcends the water resource management domain.
The WISDOM framework will be modeled and simulated, with initial testing at an experimental facility in France (AQUASIM, a full-scale test-bed facility for studying sustainable water management), then deployed and evaluated in two pilots in Cardiff (UK) and La Spezia (Italy). These demonstrators will evaluate the integrated concept, providing insight for wider adoption.
Abstract:
This work studies the manufacture of spaghetti by a high-temperature drying process using wheat flour blended with whole wheat flour and flaxseed meal, with the aim of evaluating final product quality and estimating the production cost. Moisture, ash, protein, wet gluten, gluten index, falling number and granularity were determined for the base flour and the test mixtures to verify their suitability for pasta manufacturing and their compliance with the technological criteria of current legislation. Spaghetti-type noodles were manufactured with additions of 10% and 20% whole wheat flour and of 10% and 20% flaxseed meal, and the physical-chemical, sensory and rheological properties of the products were determined. Product acceptance was then analysed and the production cost estimated, in order to support the introduction to market, by the food industry, of the products with the greatest acceptance and economic viability. The rheological evaluation consisted of a cooking test of the pasta, determining the volume increase, cooking time and percentage of solid loss. The sensory evaluation comprised a triangular product-differentiation test with 50 trained judges and an acceptance test on a hedonic scale covering color, taste, smell and texture. The sensory profile of the products was defined by quantitative descriptive analysis (QDA) with 9 judges recruited and trained at the factory, using a 9 cm unstructured scale to assess wheat flavor, flax flavor, consistency, raw pasta texture, raw pasta color and cooked pasta color. The pasta with 20% whole wheat flour showed good quality and the greatest acceptance, followed by the 10% whole wheat product; the 10% and 20% flaxseed pastas were characterized as of average quality by the solid-loss criterion, together with a commercial whole wheat pasta tested for comparison. In the assessment of estimated production cost, the two most technologically feasible and acceptable products (20% whole wheat and 10% flaxseed) were evaluated for the high-temperature process.
The total costs were R$ 4,872.5/1,000 kg and R$ 5,354.9/1,000 kg respectively, the difference being related to the amounts of added inputs and the market value of the whole wheat flour and flaxseed meal. The comparative analysis confirmed a shorter production time (10 h) and a more uniform product for high-temperature drying compared with the conventional process.
Abstract:
This study aims to assess the potential for industrial reuse of textile wastewater, after physical and chemical pretreatment, in the denim washing wet processing operations of an industrial textile laundry, with no need for complementary treatments or dilutions. The methodology and evaluation of the proposed tests were based on the production techniques used in the company, upgraded for the experiments tested. Characterization of the treated effluent for 16 selected parameters, and the development of monitoring able to qualify the treated effluent for final disposal in accordance with current legislation, were essential before reuse testing could begin. The parameters color, turbidity, SS and pH proved satisfactory as control variables and have simple determination methods. The denim quality variables considered were color, odor, appearance and soft handle. Testing began at pilot scale, following complexity factors attributed to the processes, on denim fabric and jeans, and demonstrated the possibility of reuse, since there was no interference with the processes or with the quality of the tested product. Industrial-scale tests were initiated with a control step that confirmed the efficiency of the methodology, applied to identify the possibility of reuse through tests preceding each recipe to be processed. In total, 556 replicates were performed at production scale for 47 different denim washing recipes. The percentage of water reuse was 100% for all processes and repetitions performed after the initial adjustment phase. All the jeans were classed at the highest level of internal quality control and marketed, being accepted by the contractors. The full-scale use of treated wastewater, supported by the monitoring, evaluation and control methodology suggested in this study, proved valid in textile production, causing no negative impact on the quality of the jeans produced under the presented conditions.
It is believed that this methodology can be extrapolated to other laundries, with the necessary modifications for each company, to determine the possibility of reuse in denim washing wet processing.
Abstract:
The determination of the hydrodynamic coefficients of full-scale underwater vehicles using system identification (SI) is an extremely powerful technique. The procedure is based on experimental runs and on the analysis of on-board sensor and thruster signals. The technique is cost effective and highly repeatable; however, for open-frame underwater vehicles it lacks accuracy due to sensor noise and the poor modeling of thruster-hull and thruster-thruster interaction effects. In this work, forced oscillation tests were undertaken with a full-scale open-frame underwater vehicle. These tests are unique in the sense that there are few examples in the literature taking advantage of a planar motion mechanism (PMM) installation for testing a prototype and, consequently, allowing comparison between the experimental results and those estimated by parameter identification. The inertia and drag coefficients of Morison's equation were estimated with two parameter identification methods, namely the weighted and the ordinary least-squares procedures. It was verified that the in-line force estimated from Morison's equation agrees well with the measured one, except in the region around the motion inversion points. The error analysis showed that ordinary least squares provided better accuracy, and it was therefore used to evaluate the ratio between inertia and drag forces for a range of Keulegan-Carpenter and Reynolds numbers. It was concluded that, while both the experimental and estimation techniques proved to be powerful tools for evaluating an open-frame underwater vehicle's hydrodynamic coefficients, the research also provided a rich body of reference data for comparison with reduced models as well as for dynamic motion simulation of ROVs. [DOI: 10.1115/1.4004952]
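The ordinary least-squares step described above can be sketched compactly: writing Morison's in-line force as F = c1·a(t) + c2·u(t)|u(t)|, the two lumped coefficients follow from the 2×2 normal equations. The synthetic oscillation data and the "true" coefficients below are illustrative assumptions, not values from the paper.

```python
import math

def ols_2param(x1, x2, y):
    """Ordinary least squares for y = c1*x1 + c2*x2 (no intercept),
    solved in closed form via the 2x2 normal equations."""
    s11 = sum(a * a for a in x1)
    s12 = sum(a * b for a, b in zip(x1, x2))
    s22 = sum(b * b for b in x2)
    sy1 = sum(a * c for a, c in zip(x1, y))
    sy2 = sum(b * c for b, c in zip(x2, y))
    det = s11 * s22 - s12 * s12
    return ((sy1 * s22 - sy2 * s12) / det,
            (s11 * sy2 - s12 * sy1) / det)

# Synthetic forced oscillation: u = U*cos(w*t), a = du/dt
U, w = 1.0, 2 * math.pi * 0.1          # assumed amplitude and frequency
t = [i * 0.01 for i in range(2000)]    # two full periods of motion
u = [U * math.cos(w * ti) for ti in t]
a = [-U * w * math.sin(w * ti) for ti in t]
CM_TRUE, CD_TRUE = 1.8, 1.2            # assumed lumped inertia/drag terms
F = [CM_TRUE * ai + CD_TRUE * ui * abs(ui) for ai, ui in zip(a, u)]

cm, cd = ols_2param(a, [ui * abs(ui) for ui in u], F)  # recovers 1.8, 1.2
```

With noise-free data the fit is exact; the paper's comparison of weighted versus ordinary least squares concerns exactly how such a fit degrades under real sensor noise.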
Abstract:
The use of the core-annular flow pattern, in which a thin fluid surrounds a very viscous one, has been suggested as an attractive artificial-lift method for heavy oils in the current Brazilian ultra-deepwater production scenario. This paper reports the pressure drop measurements and the core-annular flow observed in a 2 7/8-inch, 300-meter-deep pilot-scale well conveying a mixture of heavy crude oil (2000 mPa·s and 950 kg/m³ at 35 °C) and water at several combinations of the individual flow rates. The two-phase pressure drop data are compared with those of single-phase oil flow to assess the gains due to water injection. Another issue is the handling of the core-annular flow once it has been established. High-frequency pressure-gradient signals were collected, and a treatment based on the Gabor transform together with neural networks is proposed as a promising solution for monitoring and control; the preliminary results are encouraging. The pilot-scale tests, including long-term experiments, were conducted to investigate the applicability of using water to transport heavy oils in actual wells, and they represent an important step towards full-scale application of the proposed artificial-lift technology. The registered improvements in terms of oil production rate and pressure drop reduction are remarkable.
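The monitoring idea above rests on extracting time-frequency features from the pressure-gradient signals; a Gabor transform is, in essence, a windowed Fourier transform. The sketch below shows only that feature-extraction step (one Hann-windowed DFT frame), with a synthetic tone standing in for the measured signal; the neural-network stage of the paper is omitted, and all numbers are assumptions for illustration.

```python
import cmath
import math

def windowed_spectrum(x):
    """Magnitude spectrum of one Hann-windowed frame of signal x,
    i.e. a single Gabor-style time-frequency slice (direct O(N^2) DFT)."""
    n = len(x)
    w = [0.5 - 0.5 * math.cos(2 * math.pi * i / (n - 1)) for i in range(n)]
    xw = [xi * wi for xi, wi in zip(x, w)]
    return [abs(sum(xw[i] * cmath.exp(-2j * math.pi * k * i / n)
                    for i in range(n)))
            for k in range(n // 2)]  # keep the non-redundant half

fs, n = 256, 256                     # assumed sampling rate and frame size
sig = [math.sin(2 * math.pi * 32 * i / fs) for i in range(n)]  # 32 Hz tone
spec = windowed_spectrum(sig)
peak_bin = max(range(len(spec)), key=spec.__getitem__)
# with 1 Hz bin spacing, the peak lands at bin 32
```

Sliding such frames along the signal yields the spectrogram-like features that the paper feeds to a neural network for flow-regime monitoring.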
Abstract:
Slope failure occurs in many areas throughout the world and becomes an important problem when it interferes with human activity, with disasters causing loss of life and property damage. In this research we investigate slope failure through centrifuge modeling, in which a reduced-scale model, N times smaller than the full-scale prototype, is used while the acceleration is increased to N times the gravitational acceleration to preserve the stress and strain behavior. The aims of this research, "Centrifuge modeling of sandy slopes", are, in brief: 1) to test the reliability of centrifuge modeling as a tool for investigating sandy slope failure; 2) to understand how the failure mechanism is affected by changing the slope angle, and to obtain useful information for design. To achieve this, the work is arranged as follows. Chapter one: centrifuge modeling of slope failure. This chapter provides a general view of the context of the work: what a slope failure is, how it happens, and the tools available to investigate the phenomenon. It then introduces the technology used to study this topic, the geotechnical centrifuge. Chapter two: testing apparatus. The first section describes the procedures and facilities used to perform a centrifuge test. The chapter then presents the characteristics of the soil (Nevada sand), such as dry unit weight, water content and relative density, and its strength parameters (c, φ), determined in the laboratory through triaxial testing. Chapter three: centrifuge tests. This part presents the results of the centrifuge tests, namely the acceleration at failure for each model tested and its failure surface. The models tested had the same soil and geometric characteristics but different slope angles.
The angles tested in this research were 60°, 75° and 90°. Chapter four: slope stability analysis. We introduce the features and concepts of the software ReSSA (2.0), which allows the theoretical failure surfaces of the prototypes to be calculated. This section then compares the experimental failure surfaces of the prototypes, traced in the laboratory, with those calculated by the software. Chapter five: conclusion. The conclusion presents the results obtained in relation to the two main aims mentioned above.
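The scaling rule underlying the whole study (a 1/N model spun up to N·g so that stresses match the prototype) can be checked with one line of arithmetic: vertical stress ρ·(N·g)·(z/N) in the model equals ρ·g·z in the prototype. A sketch of that bookkeeping, with an illustrative sand density, scale factor and depth (not values from the thesis):

```python
G = 9.81  # m/s^2, gravitational acceleration

def model_stress(rho, z_model, n):
    """Vertical stress (Pa) at depth z_model in a 1/N model spun to N*g."""
    return rho * (n * G) * z_model

def prototype_stress(rho, z_proto):
    """Vertical stress (Pa) at depth z_proto in the full-scale slope at 1 g."""
    return rho * G * z_proto

rho, n, z = 1600.0, 50, 10.0  # assumed density (kg/m^3), scale, depth (m)
sm = model_stress(rho, z / n, n)   # 0.2 m deep point in the model at 50 g
sp = prototype_stress(rho, z)      # 10 m deep point in the prototype
# sm == sp: the model reproduces prototype stresses 1:1
```

This stress equivalence is why the failure surfaces traced on the small models can be compared directly with ReSSA's prototype-scale predictions.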
Abstract:
As land is developed, the impervious surfaces that are created increase the amount of runoff during rainfall events, disrupting the natural hydrologic cycle and increasing both runoff volume and pollutant loadings. Pollutants deposited on, or derived from an activity on, the land surface will likely end up in stormwater runoff in some concentration: nutrients, sediment, heavy metals, hydrocarbons, gasoline additives, pathogens, deicers, herbicides and pesticides. Several of these pollutants are particulate-bound, so sediment removal can clearly provide significant water-quality improvements, and knowledge of the ability of stormwater treatment devices to retain particulate matter is important. For this reason, three different sediment-removal units have been tested in the laboratory. First, a roadside gully pot was tested under steady hydraulic conditions, varying the characteristics of the influent solids (diameter, particle size distribution and specific gravity). Efficiency in terms of particles retained was evaluated as a function of influent flow rate and particle characteristics, and the results were compared with the efficiency predicted by an overflow rate model. The role of particle settling velocity in determining efficiency was also investigated. After the experimental runs on the gully pot, a standard full-scale model of a hydrodynamic separator (HS) was tested under unsteady influent flow rate and constant inlet solids concentration. The results presented in this study illustrate that the particle separation efficiency of the unit is predominantly influenced by the operating flow rate, which strongly affects the particle and hydraulic residence times of the system.
The efficiency data have been compared with results obtained from a modified overflow rate model; moreover, the residence time distribution has been determined experimentally through tracer analyses at several steady flow rates. Finally, three experiments have been performed for two configurations (linear and crenulated) of a full-scale clarifier model under unsteady influent flow rate and constant inlet solids concentration. The results illustrate that the particle separation efficiency of this unit is predominantly influenced by the configuration of the unit itself. Turbidity measurements have been compared with the suspended sediment concentration in order to find a correlation between the two, which would allow the sediment concentration to be estimated simply by installing a turbidity probe.
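For reference, the classical (unmodified) overflow-rate model that the measured efficiencies were compared against treats the unit as an ideal settling tank: a particle class with settling velocity vs is fully retained when vs exceeds the surface overflow rate Q/A, and retained in the proportion vs/(Q/A) otherwise. A sketch with a made-up particle size distribution (the study's modified model and actual PSDs are not reproduced here):

```python
def overflow_rate_efficiency(psd, q, area):
    """Ideal overflow-rate (Hazen-type) removal efficiency.

    psd:  list of (mass_fraction, settling_velocity_m_per_s) pairs
    q:    flow rate (m^3/s); area: plan (settling) area (m^2).
    A fraction with vs >= q/area settles fully; slower fractions are
    removed in proportion vs / (q/area)."""
    v_over = q / area  # surface overflow rate, m/s
    return sum(f * min(1.0, vs / v_over) for f, vs in psd)

# Hypothetical PSD: 30% fast settlers, 50% intermediate, 20% fines
psd = [(0.3, 0.02), (0.5, 0.005), (0.2, 0.001)]
eta = overflow_rate_efficiency(psd, q=0.01, area=1.0)  # Q/A = 0.01 m/s
# -> 0.3*1.0 + 0.5*0.5 + 0.2*0.1 = 0.57
```

The model's sensitivity to Q/A mirrors the experimental finding that operating flow rate dominates the separation efficiency of the gully pot and hydrodynamic separator.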
Abstract:
Environmental decay in porous masonry materials, such as brick and mortar, is a widespread problem concerning both new and historic masonry structures. The decay mechanisms are quite complex, depending upon several interconnected parameters and on the interaction with the specific micro-climate. Materials undergo both aesthetic and substantial changes in character; yet while many studies have been carried out, the mechanical aspect remains largely understudied despite its real importance from the structural viewpoint. A quantitative assessment of masonry material degradation, and of how it affects the load-bearing capacity of masonry structures, appears to be missing. The research work carried out, limited to brick masonry, addresses this issue through an experimental laboratory approach via different integrated testing procedures, both non-destructive and mechanical, together with monitoring methods. Attention was focused on the transport of moisture and salts and on the damaging effects caused by the crystallization of two different salts, sodium chloride and sodium sulphate. Many series of masonry specimens, very different in size and purpose, were used to track the damage process from its beginning and to monitor its evolution over a number of years. At the same time, suitable testing techniques (non-destructive, minimally invasive, analytical, and for monitoring) were validated for these purposes. The specimens were exposed to different aggressive agents (in terms of type of salt, brine concentration, artificial vs. open-air natural ageing, …), were tested by different means (qualitative vs. quantitative, non-destructive vs. mechanical testing, point measurements vs. wide areas, …), and differed in size (1-, 2-, 3-header-thick walls, full-scale walls vs. small specimens, brick columns and triplets vs. small walls, masonry specimens vs. single brick units and mortar prisms, …). Different advanced testing methods and novel monitoring techniques were applied in an integrated, holistic approach for a quantitative assessment of the masonry health state.
Abstract:
Objectives: Previous research conducted in the late 1980s suggested that vehicle impacts following an initial barrier collision increase severe occupant injury risk. Now over 25 years old, those data are no longer representative of currently installed barriers or the present US vehicle fleet. The purpose of this study is to provide a present-day assessment of secondary collisions and to determine whether current full-scale barrier crash testing criteria provide an indication of secondary collision risk in real-world barrier crashes. Methods: To characterize secondary collisions, 1,363 (596,331 weighted) real-world barrier midsection impacts, selected from 13 years (1997-2009) of in-depth crash data available through the National Automotive Sampling System (NASS) / Crashworthiness Data System (CDS), were analyzed. Scene diagrams and available scene photographs were used to determine roadside and barrier-specific variables unavailable in NASS/CDS. Binary logistic regression models were developed for second event occurrence and resulting driver injury. To investigate current secondary collision crash test criteria, 24 full-scale crash test reports were obtained for common non-proprietary US barriers, and the risk of secondary collisions was determined using the recommended evaluation criteria from National Cooperative Highway Research Program (NCHRP) Report 350. Results: Secondary collisions were found to occur in approximately two thirds of crashes where a barrier is the first object struck. Barrier lateral stiffness, post-impact vehicle trajectory, vehicle type, and pre-impact tracking conditions were found to be statistically significant contributors to secondary event occurrence. The presence of a second event was found to increase the likelihood of a serious driver injury by a factor of 7 compared to cases with no second event. The NCHRP Report 350 exit angle criterion was found to underestimate the risk of secondary collisions in real-world barrier crashes.
Conclusions: Consistent with previous research, collisions following a barrier impact are not an infrequent event and substantially increase driver injury risk. The results suggest that using exit-angle based crash test criteria alone to assess secondary collision risk is not sufficient to predict second collision occurrence for real-world barrier crashes.
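The factor-of-7 increase reported above reads naturally as an odds ratio from the binary logistic regression models: the log-odds of serious injury are linear in the predictors, so an indicator variable for "second event present" multiplies the odds by exp(β). A sketch with illustrative numbers only (not the study's fitted coefficients):

```python
import math

def injury_probability(intercept, beta, second_event):
    """Binary logistic model: P(injury) = 1 / (1 + exp(-(b0 + b1*x))),
    where x is the 0/1 second-event indicator."""
    z = intercept + beta * second_event
    return 1.0 / (1.0 + math.exp(-z))

beta = math.log(7.0)          # coefficient corresponding to an odds ratio of 7
b0 = math.log(0.01 / 0.99)    # assumed ~1% baseline serious-injury risk

p0 = injury_probability(b0, beta, 0)  # no second event
p1 = injury_probability(b0, beta, 1)  # second event present
# odds(p1) / odds(p0) == exp(beta) == 7, whatever the baseline risk is
```

Note that the odds ratio is constant but the probability ratio is not: at a low baseline risk the two are close, while at higher baselines the probability increase is smaller than 7×.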
Abstract:
Modifications and upgrades to the hydraulic flume facility in the Environmental Fluid Mechanics and Hydraulics Laboratory (EFM&H) at Bucknell University are described. These changes enable small-scale testing of model marine hydrokinetic (MHK) devices. The experimental platform provides a controlled environment for testing model MHK devices to determine their effect on the local substrate; specifically, the effects studied are scour and erosion around a cylindrical support structure and deposition of sediment downstream of the device.
Abstract:
As lightweight and slender structural elements are used more frequently in design, large-scale structures become more flexible and susceptible to excessive vibrations. To ensure the functionality of a structure, the dynamic properties of the occupied structure need to be estimated during the design phase. Traditional analysis methods model occupants simply as additional mass; however, research has shown that human occupants are better modeled as an additional degree of freedom. In the United Kingdom, active and passive crowd models have been proposed by the Joint Working Group (JWG) as a result of a series of analytical and experimental studies. The crowd models are expected to yield a more accurate estimate of the dynamic response of the occupied structure. However, experimental testing recently conducted through a graduate student project at Bucknell University indicated that the proposed passive crowd model might not accurately represent the occupants' impact on the structure. The objective of this study is to assess the validity of the crowd models proposed by the JWG by comparing the dynamic properties obtained from experimental testing data with analytical modeling results. The experimental data used in this study were collected by Firman in 2010. The analytical results were obtained by performing a time-history analysis on a finite element model of the occupied structure. The crowd models were created based on the recommendations of the JWG combined with the physical properties of the occupants during the experimental study. SAP2000 was used to create the finite element models and run the analysis; Matlab and ME'scope were used to obtain the dynamic properties of the structure by processing the time-history results from SAP2000.
The result of this study indicates that the active crowd model could quite accurately represent the impact on the structure from occupants standing with bent knees while the passive crowd model could not properly simulate the dynamic response of the structure when occupants were standing straight or sitting on the structure. Future work related to this study involves improving the passive crowd model and evaluating the crowd models with full-scale structure models and operating data.
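The difference between the two modelling choices discussed above (occupant as added dead mass versus as an added degree of freedom) shows up directly in the predicted natural frequencies: the added-DOF model splits the bare-structure frequency into two coupled modes that bracket it. A sketch of that 2-DOF eigenproblem, with purely illustrative masses and stiffnesses (not values from the study or the JWG guidance):

```python
import math

def two_dof_frequencies(m1, k1, m2, k2):
    """Natural frequencies (Hz) of a structure (m1, k1, grounded) with an
    occupant modelled as an attached spring-mass (m2, k2).  Solves
    det(K - lam*M) = 0, a quadratic in lam = omega^2:
    m1*m2*lam^2 - (m1*k2 + m2*(k1 + k2))*lam + k1*k2 = 0."""
    a = m1 * m2
    b = -(m1 * k2 + m2 * (k1 + k2))
    c = k1 * k2
    disc = math.sqrt(b * b - 4 * a * c)
    lams = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
    return [math.sqrt(lam) / (2 * math.pi) for lam in lams]

# Illustrative numbers: a 10 t floor near 5 Hz plus a lumped "crowd" DOF
m1, k1 = 1.0e4, 1.0e7          # structure mass (kg) and stiffness (N/m)
m2, k2 = 1.0e3, 3.0e5          # assumed crowd mass and stiffness
f_bare = math.sqrt(k1 / m1) / (2 * math.pi)   # ~5.03 Hz, empty structure
f1, f2 = two_dof_frequencies(m1, k1, m2, k2)  # coupled modes straddle f_bare
```

An added-mass model would only lower the single frequency (to sqrt(k1/(m1+m2))/2π here), whereas the added-DOF model produces the two-mode behaviour that occupied-structure measurements typically show.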