989 results for sediment bed profiling
Abstract:
A large SAV bed in upper Chesapeake Bay has experienced several abrupt shifts over the past half-century, beginning with near-complete loss after a record-breaking flood in 1972, followed by an unexpected, rapid resurgence in the early 2000s, then a partial decline in 2011 following another major flood event. Together, these trends and events provide a unique opportunity to study a recovering SAV ecosystem from several different perspectives. First, I analyzed and synthesized existing time series datasets to make inferences about what factors prompted the recovery. Next, I analyzed existing datasets, together with field samples and a simple hydrodynamic model, to investigate mechanisms of SAV bed loss and resilience to storm events. Finally, I conducted field deployments and experiments to explore how the bed affects internal physical and biogeochemical processes and what implications those effects have for the dynamics of the system. I found that modest reductions in nutrient loading, coupled with several consecutive dry years, likely facilitated the SAV resurgence. Furthermore, positive feedback processes may have played a role in the sudden nature of the recovery because they could have reinforced the state of the bed before and after the abrupt shift. I also found that scour and poor water clarity associated with sediment deposition during the 2011 flood event were mechanisms of plant loss. However, interactions between the bed, water flow, and waves served as mechanisms of resilience because these processes created favorable growing conditions (i.e., clear water, low flow velocities) in the inner core of the bed. Finally, I found that interactions between physical and biogeochemical processes led to low nutrient concentrations inside the bed relative to outside the bed, which created conditions that precluded algal growth and reinforced vascular plant dominance.
This work demonstrates that positive feedbacks play a central role in SAV resilience to both chronic eutrophication and acute storm events. Furthermore, I show that analysis of long-term ecological monitoring data, together with field measurements and experiments, can be an effective approach for understanding the mechanisms underlying ecosystem dynamics.
Abstract:
Fluvial sediment transport is controlled by hydraulics, sediment properties and arrangement, and flow history across a range of time scales. This physical complexity has led to ambiguous definition of the reference frame (Lagrangian or Eulerian) in which sediment transport is analysed. A general Eulerian-Lagrangian approach accounts for the inertial characteristics of particles in a Lagrangian (particle-fixed) frame, and for the hydrodynamics in an independent Eulerian frame. The necessary Eulerian-Lagrangian transformations are simplified under the assumption of an ideal Inertial Measurement Unit (IMU), rigidly attached at the centre of mass of a sediment particle. Real, commercially available IMU sensors can provide high-frequency data on the accelerations and angular velocities (hence forces and energy) experienced by grains during entrainment and motion, if adequately customised. IMUs are subject to significant error accumulation, but they can be used for statistical parametrisation of an Eulerian-Lagrangian model, for coarse sediment particles and over the temporal scale of individual entrainment events. In this thesis an Eulerian-Lagrangian model is introduced and evaluated experimentally. Absolute inertial accelerations were recorded at a 4 Hz frequency from a spherical instrumented particle (111 mm diameter and 2383 kg/m3 density) in a series of entrainment threshold experiments on a fixed idealised bed. The grain-top inertial acceleration entrainment threshold was approximated at 44 and 51 mg for slopes of 0.026 and 0.037, respectively. The saddle inertial acceleration entrainment threshold was at 32 and 25 mg for slopes of 0.044 and 0.057, respectively. For the evaluation of the complete Eulerian-Lagrangian model, two prototype sensors are presented: an idealised (spherical) sensor with a diameter of 90 mm and an ellipsoidal sensor with axes of 100, 70 and 30 mm.
Both are instrumented with a complete IMU, capable of sampling 3D inertial accelerations and 3D angular velocities at 50 Hz. After signal analysis, the results can be used to parametrise sediment movement, but they do not contain positional information. The two sensors (spherical and ellipsoidal) were tested in a series of entrainment experiments, similar to the evaluation of the 111 mm prototype, for a slope of 0.02. The spherical sensor entrained at discharges of 24.8 ± 1.8 l/s, while the same threshold for the ellipsoidal sensor was 45.2 ± 2.2 l/s. Kinetic energy calculations were used to quantify the particle-bed energy exchange under fluvial (discharge at 30 l/s) and non-fluvial conditions. All the experiments suggest that the effect of the inertial characteristics of coarse sediments on their motion is comparable to the effect of hydrodynamic forces. The coupling of IMU sensors with advanced telemetric systems can lead to the tracking of Lagrangian particle trajectories, at a frequency and accuracy that will permit the testing of diffusion/dispersion models across the range of particle diameters.
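The thresholding described above (comparing the magnitude of sampled inertial accelerations against an entrainment threshold expressed in milli-g) can be sketched in a few lines. This is an illustrative reconstruction, not the thesis's code; the function names and the flat sample format are assumptions.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def entrainment_events(samples, threshold_mg, rate_hz=50.0):
    """Flag IMU samples whose 3D acceleration magnitude exceeds a
    threshold given in milli-g (e.g. the reported 25-51 mg range).
    `samples` is a sequence of (ax, ay, az) in m/s^2, gravity removed.
    Returns (time in seconds, magnitude) for each exceedance."""
    threshold = threshold_mg * 1e-3 * G  # convert mg to m/s^2
    events = []
    for i, (ax, ay, az) in enumerate(samples):
        mag = math.sqrt(ax**2 + ay**2 + az**2)
        if mag > threshold:
            events.append((i / rate_hz, mag))
    return events
```

In practice such event detection would be applied per entrainment experiment, with the threshold (44, 51, 32 or 25 mg depending on slope and pivot geometry) taken from the calibration runs.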
Abstract:
The measurement of ICT (information and communication technology) integration is emerging as an area of research interest, with systems such as Education Queensland including it in their recently released list of research priorities. Studies to trial differing integration measurement instruments have taken place within Australia in the last few years, particularly in Western Australia (Trinidad, Clarkson, & Newhouse, 2004; Trinidad, Newhouse, & Clarkson, 2005), Tasmania (Fitzallen, 2005) and Queensland (Finger, Proctor, & Watson, 2005). This paper adds to these investigations by describing an alternative and original methodological approach that was trialled in a small-scale pilot study conducted jointly by the Queensland Catholic Education Commission (QCEC) and the Centre of Learning Innovation, Queensland University of Technology (QUT) in late 2005. The methodology described is based on tasks which, through a process of profiling, can be seen to be artefacts that embody the internal and external factors enabling and constraining ICT integration.
Abstract:
Experiments were undertaken to study the drying kinetics of moist, cylindrical food particulates during fluidised bed drying. Cylindrical particles were prepared from green beans with three different length:diameter ratios: 3:1, 2:1 and 1:1. A batch fluidised bed dryer connected to a heat pump system was used for the experimentation; the heat pump and fluid bed combination was used to increase overall energy efficiency and achieve higher drying rates. Drying kinetics were evaluated with non-dimensional moisture at three drying temperatures of 30, 40 and 50 °C. Numerous mathematical models can be used to describe drying kinetics, ranging from analytical models with simplified assumptions to empirical models built by regression from experimental data. Empirical models are commonly used for various food materials because of their simpler approach; however, problems with accuracy limit their application. Some limitations of empirical models can be reduced by using semi-empirical models based on the heat and mass transfer of the drying operation. One such method is the quasi-stationary approach. In this study, a modified quasi-stationary approach was used to model the drying kinetics of the cylindrical food particles at the three drying temperatures.
Abstract:
Rather than passing judgment on the content of young women's magazines, it will be argued that such texts actually exist as manuals of self-formation, manuals which enrol young women to do specific kinds of work on themselves. In doing so, they form an effective link between the governmental imperatives aimed at constructing particular personas - such as the sexually responsible young girl - and the actual practices whereby these imperatives are operationalised.
Abstract:
Computer forensics is the process of gathering and analysing evidence from computer systems to aid in the investigation of a crime. Typically, such investigations are undertaken by human forensic examiners using purpose-built software to discover evidence from a computer disk. This process is a manual one, and the time it takes for a forensic examiner to conduct such an investigation is proportional to the storage capacity of the computer's disk drives. The heterogeneity and complexity of various data formats stored on modern computer systems compounds the problems posed by the sheer volume of data. The decision to undertake a computer forensic examination of a computer system is a decision to commit significant quantities of a human examiner's time. Where there is no prior knowledge of the information contained on a computer system, this commitment of time and energy occurs with little idea of the potential benefit to the investigation. The key contribution of this research is the design and development of an automated process to describe a computer system and its activity for the purposes of a computer forensic investigation. The term proposed for this process is computer profiling. A model of a computer system and its activity has been developed over the course of this research. Using this model a computer system, which is the subject of investigation, can be automatically described in terms useful to a forensic investigator. The computer profiling process is resilient to attempts to disguise malicious computer activity. This resilience is achieved by detecting inconsistencies in the information used to infer the apparent activity of the computer. The practicality of the computer profiling process has been demonstrated by a proof-of-concept software implementation. The model and the prototype implementation utilising the model were tested with data from real computer systems.
The resilience of the process to attempts to disguise malicious activity has also been demonstrated with practical experiments conducted with the same prototype software implementation.
Abstract:
Industrial applications of simulated-moving-bed (SMB) chromatographic technology have brought an emergent demand to improve SMB process operation for higher efficiency and better robustness. Improved process modelling and more efficient model computation will pave a path to meeting this demand. However, the SMB unit operation exhibits complex dynamics, leading to challenges in SMB process modelling and model computation. One of the significant problems is how to quickly obtain the steady state of an SMB process model, as process metrics at the steady state are critical for process design and real-time control. The conventional computation method, which solves the process model cycle by cycle and takes the solution only when a cyclic steady state is reached after a certain number of switchings, is computationally expensive. Adopting the concept of the quasi-envelope (QE), this work treats the SMB operation as a pseudo-oscillatory process because of its large number of continuous switchings. An innovative QE computation scheme is then developed to quickly obtain the steady-state solution of an SMB model from any arbitrary initial condition. The QE computation scheme allows larger steps to be taken for predicting the slow change of the starting state within each switching. In combination with a wavelet-based technique, this scheme is demonstrated to be effective and efficient for an SMB sugar separation process. Moreover, investigations are also carried out on when the computation scheme should be activated and how the convergence of the scheme is affected by a variable stepsize.
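The envelope-following idea behind the QE scheme can be illustrated on a toy problem. Treat one switching period as a map f on the starting state; the envelope then evolves slowly as d = f(x) - x, and instead of iterating switch by switch, the scheme jumps H periods at once along that envelope. The sketch below is a deliberately simplified scalar stand-in for the SMB model (the map, the step size H and the function names are illustrative assumptions, not the work's actual scheme):

```python
def cycle_map(x):
    """Toy one-switching-period map: the state relaxes toward a cyclic
    steady state. Stands in for integrating the SMB model over one
    switching period. Fixed point at x* = 10."""
    return 0.9 * x + 1.0

def quasi_envelope_steady_state(x0, H=15, tol=1e-8, max_outer=1000):
    """Envelope-following sketch: d = f(x) - x approximates the slow
    envelope derivative, so take an explicit Euler step of H switching
    periods at once instead of iterating period by period."""
    x = x0
    for _ in range(max_outer):
        d = cycle_map(x) - x      # change over one switching period
        if abs(d) < tol:
            return x              # cyclic steady state reached
        x += H * d                # large step along the quasi-envelope
    return x
```

For this toy map, direct cycle-by-cycle iteration contracts by a factor 0.9 per model evaluation, while the H = 15 envelope step contracts by 0.5 per evaluation, so the steady state is reached with far fewer cycle simulations; the trade-off is that too large an H destabilises the outer iteration, which is why stepsize control (the variable stepsize investigated in the work) matters.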
Abstract:
This paper discusses the use of models in automatic computer forensic analysis, and proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgements as to the probable usage and evidentiary value of a computer system. The computer profiling object model can be implemented so as to support automated analysis to provide an investigator with the information needed to decide whether manual analysis is required.
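The core of such an information model (objects with attributes and inter-relationships, traversed to infer apparent usage) can be sketched as follows. This is an illustrative toy, not the paper's actual computer profiling object model; the class, field and relationship names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class ProfileObject:
    """One modelled artefact of the computer system: a user,
    application, file, etc., with attributes and typed relationships."""
    kind: str
    name: str
    attributes: dict = field(default_factory=dict)
    related: list = field(default_factory=list)  # (how, ProfileObject)

    def relate(self, other, how):
        self.related.append((how, other))

# A tiny profile: a user who executed a browser that wrote a file
user = ProfileObject("user", "alice")
app = ProfileObject("application", "browser", {"last_run": "2005-11-02"})
doc = ProfileObject("file", "history.dat")
user.relate(app, "executed")
app.relate(doc, "wrote")

def reachable(obj):
    """Traverse relationships from one object to collect the artefacts
    implicated in its apparent activity."""
    seen, stack = [], [obj]
    while stack:
        o = stack.pop()
        if o in seen:
            continue
        seen.append(o)
        stack.extend(target for _, target in o.related)
    return [o.name for o in seen]
```

An automated reasoning engine would work over such a graph, e.g. flagging inconsistencies where an application's attributes contradict the files it is related to.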
Abstract:
The stimulus for this project arose from the need to find an alternative solution to the aging superstructures of road bridges on low-volume roads (LVRs). The solution investigated, designed and planned for construction involved replacing the aging superstructure of a 10 m span bridge with a Flat-Bed Rail Wagon (FBRW). The main focus of this paper is to present an alternative structural system for the design of the FBRW as a road bridge deck conforming to AS5100. The structural adequacy of the primary members of the FBRW was first validated using full-scale experimental investigation to AS5100 serviceability and ultimate limit state loading. The bare FBRW was further developed to include a running surface. Two options were evaluated during the design phase, namely timber and reinforced concrete. The first option, which is presented here, involved strengthening the FBRW using numerous steel sections and overlaying the bridge deck with timber planks. The idea of this approach was to use all the primary and secondary members of the FBRW in load sharing and to provide additional members where weaknesses in the original members arose. The second option, which was the preferred option for construction, involved the use of the primary members only with an overlaying reinforced concrete slab deck. This option minimised the risk associated with any uncertainty about the structural adequacy of the secondary members. The paper reports selected results of the experiments as well as the design phases of option one, with conclusions highlighting the viability of option one and its limitations.