109 results for Gross-Pitaevskii equation
in Queensland University of Technology - ePrints Archive
Abstract:
Aims: To develop clinical protocols for acquiring PET images, performing CT-PET registration and tumour volume definition based on the PET image data, for radiotherapy for lung cancer patients and then to test these protocols with respect to levels of accuracy and reproducibility. Method: A phantom-based quality assurance study of the processes associated with using registered CT and PET scans for tumour volume definition was conducted to: (1) investigate image acquisition and manipulation techniques for registering and contouring CT and PET images in a radiotherapy treatment planning system, and (2) determine technology-based errors in the registration and contouring processes. The outcomes of the phantom image based quality assurance study were used to determine clinical protocols. Protocols were developed for (1) acquiring patient PET image data for incorporation into the 3DCRT process, particularly for ensuring that the patient is positioned in their treatment position; (2) CT-PET image registration techniques and (3) GTV definition using the PET image data. The developed clinical protocols were tested using retrospective clinical trials to assess levels of inter-user variability which may be attributed to the use of these protocols. A Siemens Somatom Open Sensation 20 slice CT scanner and a Philips Allegro stand-alone PET scanner were used to acquire the images for this research. The Philips Pinnacle3 treatment planning system was used to perform the image registration and contouring of the CT and PET images. Results: Both the attenuation-corrected and transmission images obtained from standard whole-body PET staging clinical scanning protocols were acquired and imported into the treatment planning system for the phantom-based quality assurance study. Protocols for manipulating the PET images in the treatment planning system, particularly for quantifying uptake in volumes of interest and window levels for accurate geometric visualisation were determined. 
The automatic registration algorithms were found to have sub-voxel levels of accuracy, with transmission scan-based CT-PET registration more accurate than emission scan-based registration of the phantom images. Respiration-induced image artifacts were not found to influence registration accuracy, while inadequate pre-registration overlap of the CT and PET images was found to result in large registration errors. A threshold value based on a percentage of the maximum uptake within a volume of interest was found to accurately contour the different features of the phantom despite the lower spatial resolution of the PET images. Appropriate selection of the threshold value is dependent on target-to-background ratios and the presence of respiratory motion. The results from the phantom-based study were used to design, implement and test clinical CT-PET fusion protocols. The patient PET image acquisition protocols enabled patients to be successfully identified and positioned in their radiotherapy treatment position during the acquisition of their whole-body PET staging scan. While automatic registration techniques were found to reduce inter-user variation compared to manual techniques, there was no significant difference in the registration outcomes for transmission or emission scan-based registration of the patient images, using the protocol. Tumour volumes contoured on registered patient CT-PET images using the tested threshold values and viewing windows determined from the phantom study demonstrated less inter-user variation for the primary tumour volume contours than those contoured using only the patient’s planning CT scans. Conclusions: The developed clinical protocols allow a patient’s whole-body PET staging scan to be incorporated, manipulated and quantified in the treatment planning process to improve the accuracy of gross tumour volume localisation in 3D conformal radiotherapy for lung cancer. 
Image registration protocols which factor in potential software-based errors, combined with adequate user training, are recommended to increase the accuracy and reproducibility of registration outcomes. A semi-automated adaptive threshold contouring technique incorporating a PET windowing protocol accurately defines the geometric edge of a tumour volume using PET image data from a stand-alone PET scanner, including 4D target volumes.
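The percentage-of-maximum thresholding described above can be sketched as follows; the 40% fraction and the uptake values are purely illustrative, not the thesis's calibrated figures:

```python
import numpy as np

def threshold_contour(uptake, fraction):
    """Mask voxels whose uptake is at least a fixed fraction of the
    maximum uptake within the volume of interest."""
    return uptake >= fraction * uptake.max()

# Illustrative 1D uptake profile: a hot "tumour" on a low background
# (values and the 40% fraction are hypothetical, not from the study).
profile = np.array([0.1, 0.2, 1.0, 4.0, 5.0, 4.2, 0.9, 0.2])
mask = threshold_contour(profile, 0.4)   # threshold = 0.4 * 5.0 = 2.0
# mask selects the three voxels with uptake >= 2.0
```

In practice the fraction would be tuned per target-to-background ratio and motion conditions, as the abstract notes.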
Abstract:
Field studies show that the internal screens in a gross pollutant trap (GPT) are often clogged with organic matter, due to infrequent cleaning. The hydrodynamic performance of a GPT with fully blocked screens was comprehensively investigated under a typical range of onsite operating conditions. Using an acoustic Doppler velocimeter (ADV), velocity profiles across three critical sections of the GPT were measured and integrated to examine the net fluid flow at each section. The data revealed that when the screens are fully blocked, the flow structure within the GPT radically changes. Consequently, the capture/retention performance of the device rapidly deteriorates. Good agreement was achieved between the experimental and the previous 2D computational fluid dynamics (CFD) velocity profiles for the lower GPT inlet flow conditions.
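The "integrate a measured velocity profile to examine net flow" step can be sketched with a trapezoidal rule; the depths, velocities and section width below are invented values, not the paper's ADV measurements:

```python
import numpy as np

# Hypothetical ADV velocity profile across one GPT section:
# depth positions (m) and streamwise velocities (m/s), invented values.
z = np.linspace(0.0, 0.4, 9)
u = np.array([0.05, 0.12, 0.18, 0.22, 0.24, 0.22, 0.18, 0.12, 0.05])

width = 0.3  # section width (m), assumed uniform for the sketch

# Trapezoidal integration of u over depth, scaled by the section width,
# gives a net volumetric flow estimate (m^3/s) for the section.
q = width * float(np.sum((u[1:] + u[:-1]) / 2.0 * np.diff(z)))
```

Comparing such integrated flows across the three critical sections is one way to check mass balance between them.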
Abstract:
A technique was developed to investigate the capture/retention characteristics of a gross pollutant trap (GPT) with fully and partially blocked internal screens. Custom-modified spheres of variable density filled with liquid were released into the GPT inlet and monitored at the outlet. The outlet data show that the capture/retention performance of a GPT with fully blocked screens deteriorates rapidly. At higher flow rates, the GPT approaches maximum efficiency with screen blockages below 68%. At lower flow rates, the high-performance trend is reversed and the variation in behaviour of pollutants with different densities becomes more noticeable. Additional experiments with a second, upstream-inlet-configured GPT showed an improved capture/retention performance. It was also noted that the bypass allows incoming pollutants to escape when the GPT is blocked; this useful feature prevents upstream blockages between cleaning intervals.
Abstract:
This research shows that gross pollutant traps (GPTs) continue to play an important role in preventing visible street waste—gross pollutants—from contaminating the environment. The demand for these GPTs calls for stringent quality control and this research provides a foundation to rigorously examine the devices. A novel and comprehensive testing approach to examine a dry sump GPT was developed. The GPT is designed with internal screens to capture gross pollutants—organic matter and anthropogenic litter. This device has not been previously investigated. Apart from the review of GPTs and gross pollutant data, the testing approach includes four additional aspects of this research, which are: field work and an historical overview of street waste/stormwater pollution, calibration of equipment, hydrodynamic studies and gross pollutant capture/retention investigations. This work is the first comprehensive investigation of its kind and provides valuable practical information for the current research and any future work pertaining to the operations of GPTs and management of street waste in the urban environment. Gross pollutant traps—including patented and registered designs developed by industry—have specific internal configurations and hydrodynamic separation characteristics which demand individual testing and performance assessments. Stormwater devices are usually evaluated by environmental protection agencies (EPAs), professional bodies and water research centres. In the USA, the American Society of Civil Engineers (ASCE) and the Environmental Water Resource Institute (EWRI) are examples of professional and research organisations actively involved in these evaluation/verification programs. These programs largely rely on field evaluations alone that are limited in scope, mainly for cost and logistical reasons. In Australia, evaluation/verification programs of new devices in the stormwater industry are not well established. 
The current limitations in the evaluation methodologies of GPTs have been addressed in this research by establishing a new testing approach. This approach uses a combination of physical and theoretical models to examine in detail the hydrodynamic and capture/retention characteristics of the GPT. The physical model consisted of a 50% scale model GPT rig with screen blockages varying from 0 to 100%. This rig was placed in a 20 m flume and various inlet and outflow operating conditions were modelled on observations made during the field monitoring of GPTs. Due to infrequent cleaning, the retaining screens inside the GPTs were often observed to be blocked with organic matter. Blocked screens can radically change the hydrodynamic and gross pollutant capture/retention characteristics of a GPT, as shown by this research. This research involved the use of equipment, such as acoustic Doppler velocimeters (ADVs) and dye concentration (Komori) probes, which were deployed for the first time in a dry sump GPT. Hence, it was necessary to rigorously evaluate the capability and performance of these devices, particularly in the case of the custom-made Komori probes, about which little was known. The evaluation revealed that the Komori probes have a frequency response of up to 100 Hz—which is dependent upon fluid velocities—and this was adequate to measure the relevant fluctuations of dye introduced into the GPT flow domain. The outcome of this evaluation resulted in establishing methodologies for the hydrodynamic measurements and gross pollutant capture/retention experiments. The hydrodynamic measurements consisted of point-based acoustic Doppler velocimeter (ADV) measurements, flow field particle image velocimetry (PIV) capture, head loss experiments and computational fluid dynamics (CFD) simulation. The gross pollutant capture/retention experiments included the use of anthropogenic litter components, tracer dye and custom-modified artificial gross pollutants. 
Anthropogenic litter was limited to tin cans, bottle caps and plastic bags, while the artificial pollutants consisted of 40 mm spheres with a range of four buoyancies. The hydrodynamic results led to the definition of global and local flow features. The gross pollutant capture/retention results showed that when the internal retaining screens are fully blocked, the capture/retention performance of the GPT rapidly deteriorates. The overall results showed that the GPT will operate efficiently until at least 70% of the screens are blocked, particularly at high flow rates. This important finding indicates that cleaning operations could be more effectively planned when the GPT capture/retention performance deteriorates. At lower flow rates, the capture/retention performance trends were reversed. There is little difference in the poor capture/retention performance between a fully blocked GPT and a partially filled or empty GPT with 100% screen blockages. The results also revealed that the GPT is designed with an efficient high flow bypass system to avoid upstream blockages. The capture/retention performance of the GPT at medium to high inlet flow rates is close to maximum efficiency (100%). With regard to the design appraisal of the GPT, a raised inlet offers a better capture/retention performance, particularly at lower flow rates. Further design appraisals of the GPT are recommended.
Abstract:
This paper presents a comprehensive review of scientific and grey literature on gross pollutant traps (GPTs). GPTs are designed with internal screens to capture gross pollutants—organic matter and anthropogenic litter. Their application involves professional societies, research organisations, local city councils, government agencies and the stormwater industry—often in partnership. In view of this, the 113 references include unpublished manuscripts from these bodies along with scientific peer-reviewed conference papers and journal articles. The literature reviewed was organised into a matrix of six main devices and nine research areas (testing methodologies) which include: design appraisal study, field monitoring/testing, experimental flow fields, gross pollutant capture/retention characteristics, residence time calculations, hydraulic head loss, screen blockages, flow visualisations and computational fluid dynamics (CFD). When the fifty-four-item matrix was analysed, twenty-eight research gaps were found in the tabulated literature. It was also found that the number of research gaps increased if only the scientific literature was considered. It is hoped that, in addition to informing the research community at QUT, this literature review will also be of use to other researchers in this field.
Abstract:
A novel and comprehensive testing approach to examine the performance of gross pollutant traps (GPTs) was developed. A proprietary GPT with internal screens for capturing gross pollutants—organic matter and anthropogenic litter—was used as a case study. This work is the first investigation of its kind and provides valuable practical information for the design, selection and operation of GPTs and also the management of street waste in an urban environment. It used a combination of physical and theoretical models to examine in detail the hydrodynamic and capture/retention characteristics of the GPT. The results showed that the GPT operated efficiently until at least 68% of the screens were blocked, particularly at high flow rates. At lower flow rates, the high capture/retention performance trend was reversed. It was also found that a raised-inlet GPT offered a better capture/retention performance. This finding indicates that cleaning operations could be more effectively planned in conjunction with the deterioration in the GPT’s capture/retention performance.
Abstract:
Typical flow fields in a stormwater gross pollutant trap (GPT) with blocked retaining screens were experimentally captured and visualised. Particle image velocimetry (PIV) software was used to capture the flow field data by tracking neutrally buoyant particles with a high speed camera. A technique was developed to apply the Image Based Flow Visualization (IBFV) algorithm to the experimental raw dataset generated by the PIV software. The dataset consisted of scattered 2D point velocity vectors and the IBFV visualisation facilitates flow feature characterisation within the GPT. The flow features played a pivotal role in understanding gross pollutant capture and retention within the GPT. It was found that the IBFV animations revealed otherwise unnoticed flow features and experimental artefacts. For example, a circular tracer marker in the IBFV program visually highlighted streamlines to investigate specific areas and identify the flow features within the GPT.
Abstract:
Background It has been proposed that the feral horse foot is a benchmark model for foot health in horses. However, the foot health of feral horses has not been formally investigated. Objectives To investigate the foot health of Australian feral horses and determine if foot health is affected by environmental factors, such as substrate properties and distance travelled. Methods Twenty adult feral horses from five populations (n = 100) were investigated. Populations were selected on the basis of substrate hardness and the amount of travel typical for the population. Feet were radiographed and photographed, and digital images were surveyed by two experienced assessors blinded to each other's assessment and to the population origin. Lamellar samples from 15 feet from three populations were investigated histologically for evidence of laminitis. Results There was a total of 377 gross foot abnormalities identified in 100 left forefeet. There were no abnormalities detected in three of the feet surveyed. Each population had a comparable prevalence of foot abnormalities, although the type and severity of abnormality varied among populations. Of the three populations surveyed by histopathology, the prevalence of chronic laminitis ranged between 40% and 93%. Conclusions Foot health appeared to be affected by the environment inhabited by the horses. The observed chronic laminitis may be attributable to either nutritional or traumatic causes. Given the overwhelming evidence of suboptimal foot health, it may not be appropriate for the feral horse foot to be the benchmark model for equine foot health.
Abstract:
An experimental dataset representing a typical flow field in a stormwater gross pollutant trap (GPT) was visualised. A technique was developed to apply the image-based flow visualisation (IBFV) algorithm to the raw dataset. Particle image velocimetry (PIV) software was previously used to capture the flow field data by tracking neutrally buoyant particles with a high speed camera. The dataset consisted of scattered 2D point velocity vectors and the IBFV visualisation facilitates flow feature characterisation within the GPT. The flow features played a pivotal role in understanding stormwater pollutant capture and retention behaviour within the GPT. It was found that the IBFV animations revealed otherwise unnoticed flow features and experimental artefacts. For example, a circular tracer marker in the IBFV program visually highlighted streamlines to investigate the possible flow paths of pollutants entering the GPT. The investigated flow paths were compared with the behaviour of pollutants monitored during experiments.
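As a rough sketch of the gridding step implied above: IBFV advects a noise texture over a regular grid, so the scattered PIV point vectors must first be resampled onto one. The rotational test field, point count and grid size below are invented for illustration, and `scipy.interpolate.griddata` stands in for whatever interpolation scheme the authors actually used:

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)

# Scattered PIV-style samples of a simple solid-body rotation
# u = -y, v = x (an invented field, not the GPT data).
pts = rng.uniform(-1.0, 1.0, size=(200, 2))
u, v = -pts[:, 1], pts[:, 0]

# Resample onto the regular grid a texture-advection scheme expects.
gx, gy = np.meshgrid(np.linspace(-0.5, 0.5, 16), np.linspace(-0.5, 0.5, 16))
gu = griddata(pts, u, (gx, gy), method="linear")
gv = griddata(pts, v, (gx, gy), method="linear")
# gu, gv now hold the gridded velocity components (NaN outside the
# convex hull of the scattered samples).
```

Because barycentric linear interpolation reproduces affine fields exactly, the gridded values match `-gy` and `gx` wherever they are defined, which makes the invented field a convenient self-check.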
Abstract:
Australia is a difficult market for horror movies. Particularly in recent years, Australia has been regarded as a graveyard for many horror films released theatrically. This is not to say that Australians have not enjoyed the occasional scary movie on the big screen. But what types of horror films have been popular with Australian audiences at the box-office remains poorly understood. Horror films revolve around monsters, the fear of death and the transgression of boundaries, and they aim to scare audiences through ‘gross-out’ or ‘creep-out’ factors (some combine both). The former refers to shocking and graphic portrayals of gore and violence – as seen in the sadistic torture of backpackers in Hostel (Eli Roth, 2005), which depicts limbs being hacked off and eyes being cut from nerve endings. The latter refers to the crafting of fear through mood and suspense without explicit bloodshed, achieved brilliantly in The Sixth Sense’s (M Night Shyamalan, 1999) chilling encounters with ‘dead people’. In creep-out films, it is often what viewers don’t see that is most disturbing. Using an analysis of the top fifty films each year at the Australian box office from 1992 to 2012, this article identifies the most successful horror movies over this period to ascertain what types of horror movies – with reference to creep-out and gross-out factors – have been most popular with domestic audiences.
Abstract:
Gross pollutant traps (GPTs) are designed to capture and retain visible street waste, such as anthropogenic litter and organic matter. Blocked screens, low/high downstream tidal waters and flows operating above/below the intended design limits can hamper the operations of a stormwater GPT. Under these adverse operational conditions, a recently developed GPT was evaluated. Capture and retention experiments were conducted on a 50% scale model with partially and fully blocked screens, placed inside a hydraulic flume. Flows were established through the model via an upstream channel-inlet configuration. Floatable, partially buoyant, neutrally buoyant and sinkable spheres were released into the GPT and monitored at the outlet. These experiments were repeated with a pipe-inlet configured GPT. The key findings from the experiments were of practical significance to the design, operation and maintenance of GPTs. These involved an optimum range of screen blockages and a potentially improved inlet design for efficient gross pollutant capture/retention operations. For example, the outlet data showed that the capture and retention efficiency deteriorated rapidly when the screens were fully blocked. The low pressure drop across the retaining screens and the reduced inlet flow velocities were either insufficient to mobilise the gross pollutants, or the GPT became congested.
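The capture/retention efficiency reported from such release-and-monitor runs reduces to a simple count ratio (retained over released, with escapes counted at the outlet); the per-class counts below are hypothetical, not the paper's data:

```python
def capture_efficiency(released, escaped):
    """Fraction of released test pollutants retained by the trap,
    where `escaped` is the count monitored at the outlet."""
    if released <= 0:
        raise ValueError("at least one pollutant must be released")
    if not 0 <= escaped <= released:
        raise ValueError("escaped count out of range")
    return (released - escaped) / released

# Hypothetical counts per sphere buoyancy class (not the paper's data):
runs = {
    "floatable": (50, 2),
    "neutrally buoyant": (50, 7),
    "sinkable": (50, 1),
}
efficiencies = {name: capture_efficiency(r, e) for name, (r, e) in runs.items()}
```

Repeating the ratio per buoyancy class and per blockage level is what lets density-dependent behaviour at low flow rates show up in the results.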
Abstract:
Discussion of censorship and media freedom in the context of The Interview. A few weeks before the murderous attack by Islamic extremists on the satirical journal Charlie Hebdo, the Hollywood dream factory had its own encounter with would-be censors. The Interview (Evan Goldberg and Seth Rogen, 2014), as everyone with an interest in culture and current affairs cannot fail to be aware by now, is a comedy in the “gross-out” tradition exemplified by commercially successful movies such as Ted (Seth MacFarlane, 2012) and Bridesmaids (Paul Feig, 2011). Their humour is a combination of slapstick, physical comedy, and scatological jokes involving body fluids and the like—hence the “gross”. The best of them have been very funny, as well as bordering on the offensive (see Ted’s scene involving prostitutes, a foul-mouthed teddy bear and the entertainment value of someone taking a dump on the living room floor). They have often been controversial, as in the Farrelly brothers’ Me, Myself and Irene (2000), starring Jim Carrey as a schizophrenic police officer. At their most outrageous they have pushed the boundaries of political correctness to the limit.
Abstract:
The rising problems associated with construction, such as decreasing quality and productivity, labour shortages, occupational safety, and inferior working conditions, have opened the possibility of more revolutionary solutions within the industry. One prospective option is the implementation of innovative technologies such as automation and robotics, which have the potential to improve the industry in terms of productivity, safety and quality. The construction work site could, theoretically, be contained in a safer environment, with more efficient execution of the work, greater consistency of the outcome and a higher level of control over the production process. By identifying the barriers to construction automation and robotics implementation, and investigating ways in which to overcome them, contributions could be made towards better understanding and facilitating, where relevant, greater use of these technologies in the construction industry so as to promote its efficiency. This research aims to ascertain and explain the barriers to construction automation and robotics implementation by exploring and establishing the relationship between the characteristics of the construction industry, the attributes of existing construction automation and robotics technologies, and their level of usage and implementation in three selected countries: Japan, Australia and Malaysia. These three countries were chosen as their construction industry characteristics provide contrast in terms of culture, gross domestic product, technology application, organisational structure and labour policies. 
This research uses a mixed-method approach of gathering data, both quantitative and qualitative, by employing a questionnaire survey and an interview schedule, using a wide-ranging sample from management through to on-site users, working in a range of small (less than AUD 0.2 million) to large companies (more than AUD 500 million), and involved in a broad range of business types and construction sectors. Detailed quantitative (statistical) and qualitative (content) data analysis is performed to provide a set of descriptions, relationships, and differences. The statistical tests selected for use include cross-tabulations, bivariate and multivariate analysis for investigating possible relationships between variables; and the Kruskal-Wallis and Mann-Whitney U tests of independent samples for hypothesis testing and inferring from the research sample to the construction industry population. Findings and conclusions arising from the research work, which include the ranking schemes produced for four key areas (the construction attributes on level of usage; barrier variables; differing levels of usage between countries; and future trends), have established a number of potential areas that could impact the level of implementation both globally and for individual countries.
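A minimal sketch of the nonparametric tests named above, using `scipy.stats`; the ordinal "level of usage" scores are fabricated stand-ins for the study's survey responses:

```python
import numpy as np
from scipy.stats import kruskal, mannwhitneyu

rng = np.random.default_rng(1)

# Fabricated ordinal usage scores for three countries
# (illustrative only; not the study's survey data).
japan = rng.integers(3, 6, size=40)      # scores in 3..5
australia = rng.integers(1, 4, size=40)  # scores in 1..3
malaysia = rng.integers(1, 4, size=40)   # scores in 1..3

# Kruskal-Wallis: does at least one country differ in usage level?
h_stat, p_overall = kruskal(japan, australia, malaysia)

# Mann-Whitney U: a pairwise follow-up between two countries.
u_stat, p_pair = mannwhitneyu(australia, malaysia)
```

Both tests rank the pooled observations rather than assuming normality, which suits ordinal questionnaire data of this kind.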
Abstract:
It's hard to be dispassionate about Reyner Banham. For me, and for the plethora of other people with strong opinions about Banham, his writing is compelling, and one’s connection to him as a figure quite personal. For me, frankly, he rocks. As a landscape architect, I gleaned most of my knowledge about Modern architecture from Banham. His Theory and Design in the First Machine Age, along with Rowe and Koetter’s Collage City and Venturi’s Complexity and Contradiction in Architecture were the most influential books in my library, by far. Later, as a budding “real scholar”, I was disappointed to find that, while these authors had serious credibility, the writings themselves were regarded as “polemical” – when in fact what I admired about them most was their ability and willingness to make rough groupings and gross generalizations, and to offer fickle opinions. It spoke to me of a real personal engagement and an active, participatory reading of the architectural culture they discussed. They were at their best in their witty, cutting, but generally pithy, creative prose, such as in Rowe’s extrapolation of the modern citizen as the latest “noble savage”, or Banham railing against conservative social advocates and their response to high density housing: “those who had just re-discovered ‘community’ in the slums would fear megastructure as much as any other kind of large-scale renewal program, and would see to it that the people were never ready.” Any reader of Banham will be able to find a gem that will relate, somehow, personally, to what they are doing right now. 
For Banham, it was all personal, and the gaps in his scholarship, rather, were the dispassionate places: “Such bias is essential – an unbiased historian is a pointless historian – because history is an essentially critical activity, a constant re-scrutiny and rearrangement of the profession.” Reyner Banham: Historian of the Immediate Future, Nigel Whiteley’s recent “intellectual biography” (the MIT Press, 2002), allowed me to revisit Banham’s passionate mode of criticism and to consider what his legacy might be. The book examines Banham’s body of work, grouped according to his various primary fascinations, as well as his relationship to contemporaneous theoretical movements, such as postmodernism. His mode of practice, as a kind of creative critic, is also considered in some depth. While there are points where the book delves into Banham’s personal life, on the whole Whiteley is very rigorous in considering and theorizing the work itself: more than 750 articles and twelve books. In academic terms, this is good practice. However, considering the entirely personal nature of Banham’s writing itself, this separation seems artificial. Banham, as he himself noted, “didn’t mind a gossip”, and often when reading the book I was curious about what was happening to him at the time. Banham’s was an amazing type of intellectual practice, and one that academics (a term he hated) could do well to learn from. While Whiteley spends a lot of time arguing for his practice to be regarded as such, and makes strong points about both the role of the critic, and the importance of journalism, rather than scholarly publishing, I found myself wondering what his study looked like. What books he had in his library. Did he smoke when he wrote? What sort of teaching load did he have? He is an inspiration to design writers and thinkers, and I, personally, wanted to know how he did it.
Abstract:
Synthetic polymers have attracted much attention in tissue engineering due to their ability to modulate biomechanical properties. This study investigated the feasibility of processing poly(ε-caprolactone) (PCL) homopolymer, PCL-poly(ethylene glycol) (PEG) diblock, and PCL-PEG-PCL triblock copolymers into three-dimensional porous scaffolds. Properties of the various polymers were investigated by dynamic thermal analysis. The scaffolds were manufactured using the desktop robot-based rapid prototyping technique. Gross morphology and internal three-dimensional structure of scaffolds were identified by scanning electron microscopy and micro-computed tomography, which showed excellent fusion at the filament junctions, high uniformity, and complete interconnectivity of pore networks. The influences of process parameters on scaffolds' morphological and mechanical characteristics were studied. Data confirmed that the process parameters directly influenced the pore size, porosity, and, consequently, the mechanical properties of the scaffolds. The in vitro cell culture study was performed to investigate the influence of polymer nature and scaffold architecture on the adhesion of the cells onto the scaffolds using rabbit smooth muscle cells. Light, scanning electron, and confocal laser microscopy showed cell adhesion, proliferation, and extracellular matrix formation on the surface as well as inside the structure of both scaffold groups. The completely interconnected and highly regular honeycomb-like pore morphology supported bridging of the pores via cell-to-cell contact as well as production of extracellular matrix at later time points. The results indicated that the incorporation of hydrophilic PEG into hydrophobic PCL enhanced the overall hydrophilicity and cell culture performance of PCL-PEG copolymer. However, the scaffold architecture did not significantly influence the cell culture performance in this study.