925 results for Compressed workweek.


Relevance: 20.00%

Publisher:

Abstract:

The acceleration of solid dosage form product development can be facilitated by including excipients that exhibit poly-/multi-functionality, reducing the time invested in optimising multiple excipients. Because active pharmaceutical ingredients (APIs) and tablet excipients present diverse densification behaviours upon compaction, compacting mixtures of these different powders is a complicated process. The aim of this study was to assess the macrometric characteristics and surface charge distribution of two powders, indomethacin (IND) and arginine (ARG), and to evaluate their impact on the densification properties of the two powders. Response surface modelling (RSM) was employed to predict the effect of two independent variables, compression pressure (F) and ARG percentage (R) in binary mixtures, on the properties of the resultant tablets. The study examined three responses: porosity (P), tensile strength (σ) and disintegration time (T). Micrometric studies showed that IND had a higher charge density (net charge to mass ratio) than ARG; nonetheless, ARG demonstrated good compaction properties with high plasticity (Y = 28.01 MPa). Accordingly, using ARG as a filler in IND tablets was associated with better mechanical properties: tablet tensile strength (σ) increased from 0.2 ± 0.05 N/mm² to 2.85 ± 0.36 N/mm² upon adding ARG at a molar ratio of 8:1 to IND. Moreover, tablet disintegration time was shortened to a few seconds in some of the formulations. RSM revealed that tablet porosity was affected by both compression pressure and ARG ratio for IND/ARG physical mixtures (PMs). The tensile strength (σ) and disintegration time (T) of the PMs were influenced by the compression pressure, the ARG ratio and their interaction term (FR), and a strong correlation was observed between the experimental results and the predicted data for tablet porosity. This work provides clear evidence of the multi-functionality of ARG as a filler, binder and disintegrant for directly compressed tablets.
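As an illustration of the response-surface step described in this abstract, the sketch below fits a quadratic model with an interaction term (FR) to a tensile strength response. The design points and response values are hypothetical placeholders, not data from the study; the sketch only shows the form of model the abstract refers to.

```python
# Illustrative sketch only: fitting a quadratic response-surface model
#   sigma ~ b0 + b1*F + b2*R + b3*F*R + b4*F^2 + b5*R^2
# The design points and responses below are placeholders, not values from the study.
import numpy as np

# hypothetical design: compression pressure F (MPa) and ARG fraction R (%)
F = np.array([75, 75, 150, 150, 225, 225, 150, 75, 225])
R = np.array([20, 80,  20,  80,  20,  80,  50, 50,  50])
sigma = np.array([0.4, 1.9, 0.7, 2.4, 0.9, 2.9, 1.8, 1.2, 2.1])  # placeholder tensile strengths

# quadratic design matrix including the interaction term FR
X = np.column_stack([np.ones_like(F), F, R, F * R, F**2, R**2]).astype(float)
coef, *_ = np.linalg.lstsq(X, sigma, rcond=None)

predicted = X @ coef
r_squared = 1 - np.sum((sigma - predicted) ** 2) / np.sum((sigma - sigma.mean()) ** 2)
print("coefficients:", np.round(coef, 4))
print("R^2:", round(r_squared, 3))
```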

Relevance: 20.00%

Publisher:

Abstract:

2010 Mathematics Subject Classification: 68T50, 62H30, 62J05.

Relevance: 20.00%

Publisher:

Abstract:

The police use both subjective (i.e. police staff) and automated (e.g. face recognition systems) methods for the completion of visual tasks (e.g. person identification). Image quality for police tasks has been defined as image usefulness, that is, the suitability of the visual material for satisfying a visual task. It is not necessarily degraded by artefacts that reduce visual image quality (i.e. decrease fidelity), as long as these artefacts do not affect the information that is useful for the task. The capture of useful information is affected by the unconstrained conditions commonly encountered by CCTV systems, such as variations in illumination and high compression levels. The main aim of this thesis is to investigate aspects of image quality and video compression that may affect the completion of police visual tasks/applications with respect to CCTV imagery. This is accomplished by investigating three specific police areas/tasks utilising: 1) the human visual system (HVS) for a face recognition task, 2) automated face recognition systems, and 3) automated human detection systems. These systems (HVS and automated) were assessed with defined scene content properties and video compression (H.264/MPEG-4 AVC). The performance of imaging systems and processes (e.g. subjective investigations, performance of compression algorithms) is affected by scene content properties, and no other investigation has been identified that takes scene content properties into consideration to the same extent. Results show that the HVS is more sensitive to compression effects than the automated systems. For automated face recognition systems, 'mixed lightness' scenes were the most affected and 'low lightness' scenes were the least affected by compression. In contrast, for the HVS face recognition task, 'low lightness' scenes were the most affected and 'medium lightness' scenes the least affected. For the automated human detection systems, 'close distance' and 'run approach' scenes were among the most affected. The findings have the potential to broaden the methods used for testing imaging systems for security applications.

Relevance: 20.00%

Publisher:

Abstract:

The Multimedia Internet KEYing protocol (MIKEY) aims at establishing secure credentials between two communicating entities. However, existing MIKEY modes fail to meet the requirements of low-power and low-processing devices. To address this issue, we combine two previously proposed approaches to introduce a new distributed and compressed MIKEY mode for the Internet of Things. Relying on a cooperative approach, a set of third parties is used to offload the heavy computational operations from the constrained nodes. In this way, the pre-shared key mode is used in the constrained part of the network, while the public key mode is used in the unconstrained part. Furthermore, to reduce the communication cost we introduce a new header compression scheme that reduces the size of MIKEY's header from 12 bytes to 3 bytes in the best case. Preliminary results show that our proposed mode is energy preserving while its security properties remain intact.
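The abstract does not detail the compression scheme itself, so the sketch below only illustrates the general idea behind this kind of header compression: fields that stay constant within a session are installed in a shared context and replaced by a short context identifier, shrinking a 12-byte header to 3 bytes. The field names and sizes are assumptions for illustration, not the actual MIKEY header layout or the paper's scheme.

```python
# Illustrative sketch only: ROHC-style static-field elision for a 12-byte header.
# Field names/sizes are assumptions, not the MIKEY layout or the paper's scheme.
import struct

FULL_HDR = struct.Struct("!BBBBIBBH")  # 12 bytes: version, type, next, flags, csb_id, n_cs, map_type, reserved
COMP_HDR = struct.Struct("!BH")        # 3 bytes: context id + one small dynamic field

context_store = {}  # context_id -> tuple of static field values shared by both peers

def compress(fields, context_id, dynamic_value):
    """First packet of a session carries the full header and installs a context
    (in a real scheme the context id would also be signalled); later packets
    carry only the 3-byte compressed form."""
    if context_id not in context_store:
        context_store[context_id] = fields
        return FULL_HDR.pack(*fields)
    return COMP_HDR.pack(context_id, dynamic_value)

def decompress(packet):
    if len(packet) == FULL_HDR.size:
        return FULL_HDR.unpack(packet)
    context_id, dynamic_value = COMP_HDR.unpack(packet)
    return context_store[context_id], dynamic_value

full = compress((1, 0, 5, 0, 0xCAFE, 1, 0, 0), context_id=7, dynamic_value=0)
short = compress((1, 0, 5, 0, 0xCAFE, 1, 0, 0), context_id=7, dynamic_value=42)
print(len(full), len(short))  # 12 3
```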

Relevance: 20.00%

Publisher:

Abstract:

The share of variable renewable energy in electricity generation has grown exponentially during recent decades, and owing to the heightened pursuit of environmental targets, the trend is expected to continue at an increased pace. The two most important resources, wind and insolation, both suffer from intermittency, creating a need for regulation and posing a threat to grid stability. One way to deal with the imbalance between demand and generation is to store electricity temporarily, which was addressed in this thesis by implementing a dynamic model of adiabatic compressed air energy storage (CAES) in the Apros dynamic simulation software. Based on a literature review, existing models were found, owing to their simplifications, to be insufficient for studying transient situations, and despite its importance, part-load operation had not previously been investigated with satisfactory precision. As a key result of the thesis, the cycle efficiency at the design point was simulated to be 58.7%, which agreed well with values reported in the literature and was validated through analytical calculations. Part-load performance was validated against models reported in the literature, showing good correlation. By introducing wind resource and electricity demand data to the model, grid operation of CAES was studied. To enable dynamic operation, start-up and shutdown sequences were approximated in a dynamic environment for, as far as is known, the first time, and a user component for compressor variable guide vanes (VGV) was implemented. Even in its current state, the modularly designed model offers a framework for numerous studies. The validity of the model is limited by the accuracy of the VGV correlations at part load, and the implementation of heat losses in the thermal energy storage is necessary to enable longer simulations. More extensive use of forecasts is an important development target if system operation is to be optimised in the future.
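For orientation, the sketch below shows the kind of back-of-envelope adiabatic CAES cycle calculation that such a dynamic model refines. It uses a single compression/expansion stage with assumed component efficiencies and ignores cavern, throttling and heat losses, so it overestimates the round-trip efficiency relative to the 58.7% reported above; none of the parameter values are taken from the thesis.

```python
# Back-of-envelope sketch of an adiabatic CAES charge/discharge cycle.
# Single-stage ideal-gas model with assumed efficiencies; it deliberately ignores
# cavern, throttling and TES heat losses, so it overestimates the ~58.7%
# round-trip efficiency reported in the thesis.
CP, GAMMA = 1005.0, 1.4                     # J/(kg K), specific heat ratio of air
T_AMB = 293.0                               # K, compressor inlet temperature
PR = 8.0                                    # storage pressure ratio (assumed)
ETA_C, ETA_T, ETA_TES = 0.85, 0.88, 0.90    # isentropic and TES effectiveness (assumed)

k = (GAMMA - 1.0) / GAMMA

# charge: compression work per kg of air, with the heat of compression captured in the TES
t2s = T_AMB * PR ** k
t2 = T_AMB + (t2s - T_AMB) / ETA_C
w_compressor = CP * (t2 - T_AMB)

# discharge: air reheated by the thermal store, then expanded through the turbine
t3 = T_AMB + ETA_TES * (t2 - T_AMB)
t4s = t3 / PR ** k
w_turbine = CP * ETA_T * (t3 - t4s)

print(f"round-trip efficiency ~ {w_turbine / w_compressor:.1%}")
```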

Relevance: 20.00%

Publisher:

Abstract:

The use of renewable energy in response to the EU 2030 Climate and Energy targets has been increasing. However, non-dispatchable and intermittent renewables such as wind and solar cannot generally match supply to demand, which can cause problems in the grid. Interest in energy storage has therefore grown, and there is now an urgent need for larger energy storage capacity. Compressed Air Energy Storage (CAES) is a proven technology for storing large quantities of electrical energy in the form of high-pressure air for later use when electricity is needed. It has existed since the 1970s and is one of the few energy storage technologies suitable for long-duration (tens of hours) and utility-scale (hundreds to thousands of MW) applications. It is also one of the most cost-effective solutions for large to small scale storage applications. CAES can be integrated into, and bring advantages to, different levels of the electric system, from generation to transmission and distribution, so this paper revisits CAES in order to better understand what it can be used for and how it can serve our modern energy storage needs.

Relevance: 10.00%

Publisher:

Abstract:

This thesis is a documented energy audit and long-term study of energy and water reduction in a ghee factory. Global production of ghee exceeds 4 million tonnes annually. The factory in this study refines dairy products by non-traditional centrifugal separation and produces 99.9% pure, canned, crystallised Anhydrous Milk Fat (ghee). Ghee is traditionally made by batch processing, which is less efficient than centrifugal separation. An in-depth, systematic investigation was conducted of each item of major equipment: the ammonia refrigeration plant, steam boiler, canning equipment, pumps, heat exchangers and compressed air system were all fine-tuned. Continuous monitoring of electrical usage showed that not every initiative worked; others had payback periods of less than a year. In 1994-95 energy consumption was 6,582 GJ and in 2003-04 it was 5,552 GJ, down 16% for a similar output. A significant reduction in water usage was achieved by reducing the airflow in the refrigeration evaporative condensers to match the refrigeration load. Water usage fell 68%, from 18 ML in 1994-95 to 5.78 ML in 2003-04. The methods reported in this thesis could be applied to other industries with similar equipment, and to other ghee manufacturers.

Relevance: 10.00%

Publisher:

Abstract:

Fatigue and overwork are problems experienced by numerous employees in many industry sectors. A focus on improving work-life balance can reframe the 'problem' of long work hours and help resolve working-time duration issues. Flexible work options achieved by reorganising working time arrangements are key to developing an organisational response for delivering work-life balance, and usually involve changing the internal structure of work time. This study examines the effect of compressed long weekly working hours, and the consequent 'long break', on work-life balance. Using spillover theory and border theory, this research considers organisational and personal determinants of overwork and fatigue. It concludes that compressed long work hours with a long break provide better work-life balance. Further, a long break allows employees to regain 'personal time' and overcome fatigue.

Relevance: 10.00%

Publisher:

Abstract:

Heart disease is the leading cause of death in the world. Although this could be alleviated by heart transplantation, there is a chronic shortage of donor hearts, so mechanical solutions are being considered. Currently, many Ventricular Assist Devices (VADs) are being developed worldwide in an effort to increase life expectancy and quality of life for end-stage heart failure patients. Current pre-clinical testing methods for VADs involve laboratory testing using Mock Circulation Loops (MCLs) and in vivo testing in animal models. The research and development of highly accurate MCLs is vital to the continuous improvement of VAD performance. The first objective of this study was to develop and validate a mathematical model of an MCL. This model could then be used in the design and construction of a variable compliance chamber to improve the performance of an existing MCL, as well as form the basis for a new miniaturised MCL. An extensive literature review was carried out on MCLs and the mathematical modelling of their function. A mathematical model of an MCL was then created in the MATLAB/SIMULINK environment. This model included variable features such as resistance, fluid inertia and volumes (resulting from the pipe lengths and diameters); compliance of the Windkessel chambers, atria and ventricles; density of both the fluid and the compressed air applied to the system; gravitational effects on vertical columns of fluid; and accurately modelled actuators controlling the ventricle contraction. The model was validated using the physical properties and the pressure and flow traces produced by a previously developed MCL. A variable compliance chamber was then designed to reproduce parameters determined by the mathematical model. The variability was achieved by controlling the transmural pressure across a diaphragm to alter the compliance of the system. An initial prototype was tested in a previously developed MCL, and a variable level of arterial compliance was successfully produced; however, the complete range of compliance values required for accurate physiological representation could not be produced with this initial design. The mathematical model was then used to design a smaller physical mock circulation loop, with the tubing sizes adjusted to produce accurate pressure and flow traces while retaining an appropriate frequency response characteristic. The development of the mathematical model greatly assisted the general design of an in vitro cardiovascular device test rig, while the variable compliance chamber allowed simple, real-time manipulation of MCL compliance and accurate transitions between a variety of physiological conditions. The newly developed MCL provides an accurate mechanical representation of the human circulatory system for in vitro cardiovascular device testing and education purposes. The continued improvement of VAD test rigs is essential if VAD design is to improve, and hence improve quality of life and life expectancy for heart failure patients.
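To give a flavour of the lumped-parameter modelling described here (the thesis model is far richer, including inertia, gravity and actuator dynamics), the sketch below solves a two-element Windkessel: an arterial compliance and a peripheral resistance driven by a prescribed ventricular outflow. All parameter values are illustrative assumptions, not values from the thesis or its MATLAB/SIMULINK model.

```python
# Minimal sketch of the lumped-parameter idea behind a mock circulation loop:
# a two-element Windkessel (arterial compliance C, peripheral resistance R)
# driven by a prescribed ventricular outflow. Parameter values are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

R = 1.0        # peripheral resistance, mmHg*s/mL
C = 1.5        # arterial compliance, mL/mmHg
HR = 75        # heart rate, beats per minute
T = 60.0 / HR  # cardiac period, s
T_EJ = 0.3     # ejection duration, s
SV = 70.0      # stroke volume, mL

def q_in(t):
    """Half-sinusoidal ejection flow delivering one stroke volume per beat."""
    tau = t % T
    if tau < T_EJ:
        return SV * np.pi / (2.0 * T_EJ) * np.sin(np.pi * tau / T_EJ)
    return 0.0

def dp_dt(t, p):
    # C dP/dt = Q_in(t) - P/R
    return [(q_in(t) - p[0] / R) / C]

sol = solve_ivp(dp_dt, [0.0, 10 * T], [80.0], max_step=1e-3)
last_beat = sol.t > 9 * T
print(f"systolic ~ {sol.y[0][last_beat].max():.0f} mmHg, "
      f"diastolic ~ {sol.y[0][last_beat].min():.0f} mmHg")
```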

Relevance: 10.00%

Publisher:

Abstract:

This thesis investigates aspects of encoding the speech spectrum at low bit rates, with extensions to the effect of such coding on automatic speaker identification. Vector quantization (VQ) is a technique for jointly quantizing a block of samples at once, in order to reduce the bit rate of a coding system. The major drawback of VQ is the complexity of the encoder. Recent research has indicated the potential applicability of the VQ method to speech when product code vector quantization (PCVQ) techniques are utilized. The focus of this research is the efficient representation, calculation and utilization of the speech model as stored in the PCVQ codebook. In this thesis, several VQ approaches are evaluated, and the efficacy of two training algorithms is compared experimentally. It is then shown that these product-code vector quantization algorithms may be augmented with lossless compression algorithms, thus yielding an improved overall compression rate. An approach is introduced that uses a statistical model of the vector codebook indices for subsequent lossless compression. This coupling of lossy and lossless compression enables further compression gain. It is demonstrated that this approach is able to reduce the bit rate requirement from the current 24 bits per 20-millisecond frame to below 20 bits, using a standard spectral distortion metric for comparison. Several fast-search VQ methods for use in speech spectrum coding have also been evaluated. The usefulness of fast-search algorithms is highly dependent upon the source characteristics and, although previous research has addressed the coding of images using VQ codebooks trained directly on the source samples, the product-code structured codebooks for speech spectrum quantization place new constraints on the search methodology. The second major focus of the research is an investigation of the effect of low-rate spectral compression methods on the task of automatic speaker identification. The motivation for this aspect of the research arose from a need to simultaneously preserve speech quality and intelligibility and to provide for machine-based automatic speaker recognition using the compressed speech. This is important because there are several emerging applications of speaker identification in which compressed speech is involved. Examples include mobile communications, where the speech has been highly compressed, or where a database of speech material has been assembled and stored in compressed form. Although these two application areas share the same objective of maximizing the identification rate, the starting points are quite different. On the one hand, the speech material used for training the identification algorithm may or may not be available in compressed form. On the other hand, the new test material on which identification is to be based may only be available in compressed form. Using the spectral parameters stored in compressed form, two main classes of speaker identification algorithm are examined. Some studies have been conducted in the past on bandwidth-limited speaker identification, but the use of short-term spectral compression deserves separate investigation. Combining the major aspects of the research, some important design guidelines are put forward for the construction of an identification model based on the use of compressed speech.
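A minimal sketch of the two stages coupled here follows: a lossy VQ stage (nearest-codeword search) and a lossless stage represented by the zeroth-order entropy of the resulting index stream, which bounds what an entropy coder could achieve. Codebook training (e.g. LBG) and the product-code split are omitted, and all sizes and data are placeholders rather than values from the thesis.

```python
# Illustrative sketch only: VQ encoding followed by an entropy estimate of the
# index stream. Codebook training and the product-code structure are omitted;
# data and sizes are placeholders, not values from the thesis.
import numpy as np

rng = np.random.default_rng(0)
codebook = rng.normal(size=(64, 10))   # 64 codewords for 10-dim spectral vectors
frames = rng.normal(size=(500, 10))    # hypothetical spectral parameter frames

# lossy stage: index of the nearest codeword for every frame (6 bits/frame here)
dists = np.linalg.norm(frames[:, None, :] - codebook[None, :, :], axis=2)
indices = dists.argmin(axis=1)

# lossless stage: if the indices are not uniformly used, their entropy is < log2(64),
# so a statistical model of the indices can shave off additional bits
counts = np.bincount(indices, minlength=len(codebook))
p = counts[counts > 0] / counts.sum()
entropy = -(p * np.log2(p)).sum()
print(f"raw index cost: {np.log2(len(codebook)):.1f} bits, "
      f"entropy bound: {entropy:.2f} bits per frame")
```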

Relevance: 10.00%

Publisher:

Abstract:

In the field of tissue engineering, new polymers are needed to fabricate scaffolds with specific properties depending on the targeted tissue. This work aimed at designing and developing a 3D scaffold with variable mechanical strength, a fully interconnected porous network, and controllable hydrophilicity and degradability. For this, a desktop-robot-based melt-extrusion rapid prototyping technique was applied to a novel tri-block co-polymer, namely poly(ethylene glycol)-block-poly(ε-caprolactone)-block-poly(DL-lactide), PEG-PCL-P(DL)LA. This co-polymer was melted by electrical heating and directly extruded using compressed purified air, under computer-controlled rapid prototyping, to build porous scaffolds. Various lay-down patterns (0/30/60/90/120/150°, 0/45/90/135°, 0/60/120° and 0/90°) were produced by appropriate positioning of the robotic control system. Scanning electron microscopy and micro-computed tomography showed that the 3D scaffold architectures were honeycomb-like, with completely interconnected and controlled channel characteristics. Compression tests were performed, and the data obtained agreed well with the typical behaviour of a porous material undergoing deformation. The preliminary cell response to the as-fabricated scaffolds was studied with primary human fibroblasts. The results demonstrated the suitability of the process and the cell biocompatibility of the polymer, two important properties among the many required for effective clinical use and efficient tissue-engineering scaffolding.

Relevance: 10.00%

Publisher:

Abstract:

A composite line source emission (CLSE) model was developed to quantify exposure levels and describe the spatial variability of vehicle emissions in traffic-interrupted microenvironments. This model took into account the complexity of vehicle movements in the queue, as well as the different emission rates relevant to the various driving conditions (cruise, deceleration, idle and acceleration), and it utilised multiple representative segments to capture the emission distribution of real vehicle flow accurately. Hence, the model was able to quickly quantify the time spent in each segment within the considered zone, as well as the composition and position of the requisite segments based on vehicle fleet information, which not only helped to quantify the enhanced emissions at critical locations but also helped to define the emission source distribution of the disrupted steady flow for further dispersion modelling. The model was then applied to estimate particle number emissions at a bi-directional bus station used by diesel and compressed natural gas fuelled buses. It was found that the acceleration distance was of critical importance when estimating particle number emissions, since the highest emissions occurred in sections where most of the buses were accelerating, and no significant increases were observed at locations where they idled. It was also shown that emissions at the front end of the platform were 43 times greater than at the rear of the platform. Although the CLSE model is intended to be applied in traffic management and transport analysis systems for the evaluation of exposure and the simulation of vehicle emissions in traffic-interrupted microenvironments, the bus station model can also be used to provide initial source definitions for future dispersion models.
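As a rough illustration of the composite line source idea, the sketch below sums, over a few representative platform segments, the time a bus spends in each driving mode multiplied by a mode-specific emission rate. The segment layout, residence times and emission factors are invented placeholders, not values from the study.

```python
# Illustrative sketch only: total emissions along the platform are the sum, over
# representative segments, of mode residence time times a mode-specific emission rate.
# All rates, times and the segment layout are placeholders, not values from the study.

# driving-mode particle number emission rates (particles/s), assumed for illustration
RATES = {"idle": 1.0e11, "accelerate": 8.0e11, "cruise": 3.0e11, "decelerate": 1.5e11}

# per-segment residence time (s) of one bus pass in each driving mode
segments = [
    {"idle": 20.0, "accelerate": 0.0, "cruise": 0.0, "decelerate": 4.0},  # rear / stopping
    {"idle": 5.0,  "accelerate": 6.0, "cruise": 0.0, "decelerate": 0.0},  # pull-away
    {"idle": 0.0,  "accelerate": 4.0, "cruise": 3.0, "decelerate": 0.0},  # front / exit
]

def segment_emissions(seg, rates=RATES):
    """Particle number emitted in one segment by one bus pass."""
    return sum(rates[mode] * t for mode, t in seg.items())

buses_per_hour = 40  # assumed fleet throughput
for i, seg in enumerate(segments):
    print(f"segment {i}: {segment_emissions(seg) * buses_per_hour:.2e} particles/hour")
```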

Relevance: 10.00%

Publisher:

Abstract:

As a result of a broad invitation extended by Professor Martin Betts, Executive Dean of the Faculty of Built Environment and Engineering, to the community of interest at QUT, a cross-disciplinary collaborative workshop was conducted to contribute ideas in response to the Government of India's urgent requirement to implement a program to re-house slum dwellers. This is a complex problem facing the Indian Ministry of Housing: not only does the government aspire to eradicate existing slum conditions and achieve tangible results within five years, but it must also ensure that slums do not form in the future. The workshop focused on technological innovation in construction to deliver the transformation from the current unsanitary and overcrowded informal urban settlements to places that provide the economically weaker sections of Indian society with healthy, environmentally sustainable, economically viable mass housing that supports successful urban living. The workshop was conducted as a two-part process. Initially, QUT academics from diverse fields shared current research and provided technical background to contextualise the challenge at a pre-workshop briefing session. This was followed by a one-day workshop during which participants worked intensively in multi-disciplinary groups through a series of exercises to develop innovative approaches to the complex problem of slum redevelopment. Dynamic, compressed work sessions, interspersed with cross-functional review and feedback by the whole group, took place throughout the day. Reviews emphasised testing the concepts for their level of complexity and likelihood of success. The two-stage workshop process achieved several objectives:
- Inspired a sense of shared purpose amongst a diverse group of academics
- Built participants' knowledge of each other's capacity
- Engaged multi-disciplinary teams in an innovative design research process
- Built participants' confidence in the collaborative process
- Demonstrated that collaborative problem solving can create solutions that represent transformative change
- Developed a framework for how workable solutions might be developed for the program through follow-up workshops and charrettes of a similar nature, involving stakeholders drawn from the management of the slum housing program.

Relevance: 10.00%

Publisher:

Abstract:

Hayabusa, an unmanned Japanese spacecraft, was launched to study and collect samples from the surface of the asteroid 25143 Itokawa. In June 2010, the Hayabusa spacecraft completed its seven-year voyage. The spacecraft and the sample return capsule (SRC) re-entered the Earth's atmosphere over the central Australian desert at speeds of the order of 12 km/s. This provided a rare opportunity to experimentally investigate the radiative heat transfer from the shock-compressed gases in front of the sample return capsule at true-flight conditions. This paper reports the results of observations from a tracking camera situated on the ground, about 100 km from where the capsule experienced peak heating during re-entry.

Relevance: 10.00%

Publisher:

Abstract:

Unusual event detection in crowded scenes remains challenging because of the diversity of events and the presence of noise. In this paper, we present a novel approach for unusual event detection via sparse reconstruction of dynamic textures over an overcomplete basis set, with the dynamic textures described by local binary patterns from three orthogonal planes (LBP-TOP). The overcomplete basis set is learnt from training data in which only normal items are observed. In the detection process, given a new observation, we compute the sparse coefficients using the Dantzig selector, an algorithm proposed in the compressed sensing literature. The reconstruction errors are then computed, and abnormal items are detected on that basis. Our approach can be used to detect both local and global abnormal events. We evaluate our algorithm on the UCSD abnormality datasets for local anomaly detection, where it is shown to outperform current state-of-the-art approaches, and we also obtain promising results for rapid escape detection using the PETS2009 dataset.
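A conceptual sketch of sparse-reconstruction anomaly scoring follows. For simplicity, the "overcomplete basis" here is just the set of normal training descriptors (standing in for a basis learnt from LBP-TOP features), and Lasso is used as a convenient substitute for the Dantzig selector used in the paper; all data are synthetic placeholders. A test sample is sparsely coded over the basis and its reconstruction error is used as the abnormality score.

```python
# Conceptual sketch only: sparse-reconstruction anomaly scoring.
# The basis is simply the normal training descriptors, and Lasso stands in for
# the Dantzig selector; data are synthetic placeholders, not LBP-TOP features.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
# "normal" descriptors lie near a low-dimensional structure in a 30-dim feature space
subspace = rng.normal(size=(5, 30))
normal_train = rng.normal(size=(200, 5)) @ subspace   # 200 normal samples
D = normal_train.T                                    # overcomplete basis, shape (30, 200)

def abnormality_score(x, basis, alpha=0.1):
    """Sparse-code x over the basis and return the reconstruction error."""
    coder = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
    coder.fit(basis, x)
    return np.linalg.norm(x - basis @ coder.coef_)

normal_test = rng.normal(size=5) @ subspace           # consistent with the normal data
unusual_test = rng.normal(size=30) * 3.0              # off the normal structure
print("normal score :", round(abnormality_score(normal_test, D), 3))
print("unusual score:", round(abnormality_score(unusual_test, D), 3))
```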