921 results for Cutting speeds
Abstract:
Introduction: An observer looking sideways from a moving vehicle while wearing a neutral density filter over one eye can have a distorted perception of speed, known as the Enright phenomenon. The purpose of this study was to determine how the Enright phenomenon influences driving behaviour. Methods: A geometric model of the Enright phenomenon was developed. Ten young, visually normal participants (mean age = 25.4 years) were tested on a straight section of a closed driving circuit and instructed to look out of the right side of the vehicle and drive at either 40 km/h or 60 km/h under the following binocular viewing conditions: with a 0.9 ND filter over the left eye (leading eye); a 0.9 ND filter over the right eye (trailing eye); 0.9 ND filters over both eyes; and with no filters over either eye. The order of filter conditions was randomised and the speed driven was recorded for each condition. Results: Speed judgments did not differ significantly between the two baseline conditions (no filters and both eyes filtered) for either speed tested. For the baseline conditions, when subjects were asked to drive at 60 km/h they matched this speed well (61 ± 10.2 km/h) but drove significantly faster than requested (51.6 ± 9.4 km/h) when asked to drive at 40 km/h. Subjects significantly exceeded baseline speeds by 8.7 ± 5.0 km/h when the trailing eye was filtered, and travelled slower than baseline speeds by 3.7 ± 4.6 km/h when the leading eye was filtered. Conclusions: This is the first quantitative study demonstrating how the Enright effect can influence perceptions of driving speed, and it demonstrates that monocular filtering of an eye can significantly affect driving speeds, albeit to a lesser extent than predicted by geometric models of the phenomenon.
Abstract:
Background: Few studies have specifically investigated the functional effects of uncorrected astigmatism on measures of reading fluency. This information is important to provide evidence for the development of clinical guidelines for the correction of astigmatism. Methods: Participants included 30 visually normal young adults (mean age 21.7 ± 3.4 years). Distance and near visual acuity and reading fluency were assessed with optimal spectacle correction (baseline) and for two levels of astigmatism, 1.00DC and 2.00DC, at two axes (90° and 180°) to induce both against-the-rule (ATR) and with-the-rule (WTR) astigmatism. Reading and eye movement fluency were assessed using standardised clinical measures, including the test of Discrete Reading Rate (DRR) and the Developmental Eye Movement (DEM) test, and by recording eye movement patterns with the Visagraph III during reading for comprehension. Results: Both distance and near acuity were significantly decreased compared to baseline for all of the astigmatic lens conditions (p < 0.001). Reading speed with the DRR for N16 print size was significantly reduced for the 2.00DC ATR condition (a reduction of 10%), while for smaller text sizes reading speed was reduced by up to 24% for the 1.00DC ATR condition and for the 2.00DC condition in both axis directions (p < 0.05). For the DEM, sub-test completion speeds were significantly impaired, with the 2.00DC condition affecting both vertical and horizontal times and the 1.00DC ATR condition affecting only horizontal times (p < 0.05). Visagraph reading eye movements were not significantly affected by the induced astigmatism. Conclusions: Induced astigmatism impaired performance on selected tests of reading fluency, with ATR astigmatism having significantly greater effects on performance than WTR astigmatism, even for relatively small amounts of astigmatic blur of 1.00DC. These findings have implications for the minimal prescribing criteria for astigmatic refractive errors.
Abstract:
Peeling is an essential phase of the post-harvest and processing industry; however, the losses and waste that occur during the peeling stage are a persistent concern for the food processing sector. There are three methods of peeling fruits and vegetables, namely mechanical, chemical and thermal, with the choice depending on the class and type of produce. By comparison, the mechanical method is the most preferred: it keeps the edible portions of produce fresh and causes less damage. Reducing material losses and increasing the quality of the process has a direct effect on the overall efficiency of the food processing industry, which calls for further study of the technological aspects of this industrial segment. To enhance the effectiveness of food industrial practices, it is essential to have a clear understanding of material properties and the behaviour of tissues under industrial processes. This paper presents a scheme of research that seeks to examine tissue damage of tough-skinned vegetables under the mechanical peeling process by developing a novel finite element (FE) model of the process using an explicit dynamic finite element analysis approach. The proposed study will develop a nonlinear model capable of simulating the peeling process specifically. It is expected that currently unavailable information, such as cutting force, maximum shearing force, shear strength, tensile strength and rupture stress, will be quantified using the new FEA model. The outcomes will be used to optimise and improve current mechanical peeling methods for this class of vegetables and thereby enhance the overall effectiveness of processing operations. This paper reviews the available literature and previous work in this area of research and identifies the current gap in the modelling and simulation of food processes.
Abstract:
Curation of a fashion parade (Exposed) of QUT student swimwear designs held in conjunction with the ‘Woollen Mermaids’ (history of swimwear) exhibition at the QLD Museum. The research explored the exhibition of ‘cutting edge’ swimwear produced with non-traditional fabrics (wool) and experimented with display/presentation styles for fashion parades in museum settings. The paid, ticketed event was attended by over 800 people.
Abstract:
For many people, a relatively large proportion of daily exposure to a multitude of pollutants may occur inside an automobile. A key determinant of exposure is the amount of outdoor air entering the cabin (i.e. the air change or flow rate). We quantified this parameter in six passenger vehicles, ranging in age from less than 1 year to 18 years, at three vehicle speeds and under four different ventilation settings. Average infiltration into the cabin with all operable air entry pathways closed was between 1 and 33.1 air changes per hour (ACH) at a vehicle speed of 60 km/h, and between 2.6 and 47.3 ACH at 110 km/h, with these results representing the most (2005 Volkswagen Golf) and least air-tight (1989 Mazda 121) vehicles, respectively. Average infiltration into stationary vehicles parked outdoors varied between ~0 and 1.4 ACH and was moderately related to wind speed. Measurements were also performed under an air recirculation setting with low fan speed, while airflow rate measurements were conducted under two non-recirculating ventilation settings with low and high fan speeds. The windows were closed in all cases, and over 200 measurements were performed. The results can be applied to estimate pollutant exposure inside vehicles, as sketched below.
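To illustrate how such ACH values translate into exposure estimates, here is a minimal sketch applying the standard well-mixed box model, in which the in-cabin concentration relaxes exponentially toward the outdoor level at a rate set by the air change rate. The two ACH values are taken from the abstract; the outdoor concentration, initial condition and exposure duration are hypothetical.

```python
import math

def cabin_concentration(c_out, c0, ach, t_hours):
    """Well-mixed box model with no in-cabin sources or sinks:
    C(t) = C_out + (C_0 - C_out) * exp(-ACH * t)."""
    return c_out + (c0 - c_out) * math.exp(-ach * t_hours)

# Hypothetical scenario: clean cabin (C_0 = 0) entering polluted
# traffic air (C_out = 50 ug/m^3), windows and vents closed.
c_out, c0 = 50.0, 0.0
for label, ach in [("2005 VW Golf @ 60 km/h", 1.0),
                   ("1989 Mazda 121 @ 60 km/h", 33.1)]:
    c_10min = cabin_concentration(c_out, c0, ach, t_hours=10 / 60)
    print(f"{label}: {c_10min:.1f} ug/m^3 after 10 min")
# The leaky vehicle approaches outdoor levels within minutes,
# while the air-tight one still shields most of the exposure.
```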
Abstract:
A single-subject longevity study is presented as a case study for the Medical Device Partnering Program (MDPP). The MDPP supports the development of cutting-edge medical devices and assistive technologies through unique collaborations between researchers, industry, clinical end-users and government. The study aimed to identify what effect the innersole has on specific muscles that may influence stability, and whether the innersole had any influence on gait. Three tests were conducted: a standard gait test, a dynamic balance test and a standing balance test. Results from the kinematic analysis showed reduced variability in post-testing results when compared to pre-testing results. Reductions in muscle activation levels were also found across all tests. Further testing with a larger sample size is required to determine whether these effects are due to the innersole.
Abstract:
High Speed Rail (HSR) is rapidly gaining popularity worldwide as a safe and efficient transport option for long-distance travel. Designed to win market share from air transport, HSR systems optimise their productivity between increasing speeds and station spacing to offer high quality service and gain ridership. Recent studies have investigated the effects that the deployment of HSR infrastructure has on spatial distribution and the economic development of cities and regions. Findings appear mostly positive at higher geographical scales, where HSR links connect major urban centres several hundred kilometres apart and already well positioned within a national or international context. At the urban level, studies have also shown regeneration and concentration effects around HSR station areas, with positive returns on a city's image and economy. However, doubts persist about the effects of HSR at an intermediate scale, where the accessibility trade-off on station spacing limits access for many small and medium agglomerations, significantly reducing their ability to participate in the development opportunities facilitated by HSR infrastructure. The locational advantages deriving from transport improvements appear contrasting especially in regions with a polycentric structure, where cities may present great accessibility disparities between those served by HSR and those left behind. This thesis is set in this context, in which intermediate and regional cities do not directly enjoy the presence of an HSR station while having an existing or planned HSR corridor nearby. With the aim of understanding whether there might be a solution to this apparent incongruity, the research investigates strategies to integrate HSR accessibility at the regional level. While the current literature recommends committing to ancillary investments in the uplift of station areas and the renewal of feeder systems, I hypothesised interoperability between the HSR and conventional networks in order to explore the possibilities offered by mixed traffic and infrastructure sharing. I then developed a methodology to quantify the exchange of benefits deriving from this synergistic interaction. In this way, it was possible to understand which level of service quality offered by alternative transit strategies best facilitates the distribution of accessibility benefits to areas far from actual HSR stations. Strategies were therefore selected for a type of service capable of regional extension and urban penetration, while incorporating a combination of specific advantages (e.g. speed, sub-urbanity, capacity, frequency and automation) in order to emulate HSR quality with increasingly efficient services. The North-eastern Italian macro-region was selected as the case study to ground the research, offering concurrently a peripheral polycentric metropolitan form, the presence of a planned HSR corridor with some portions of HSR infrastructure implemented, and a project to develop a suburban rail service extended regionally. Results show significant distributive potential, in terms of network effects produced in relation to HSR, in increasing proportions for all the strategies considered: a regional metro rail (RMR) strategy, a regional high speed rail (RHSR) strategy, a regional light rail transit (LRT) strategy, and a non-stopping continuous railway system (CRS) strategy.
The provision of additional tools to value HSR infrastructure against its accessibility benefits, and their regional distribution through alternative strategies beyond the actual HSR stations, would have great implications, both politically and technically, in moving towards new dimensions of HSR evaluation and development (a toy accessibility computation is sketched below).
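To illustrate the kind of accessibility quantification such a methodology builds on, the following sketch computes a standard gravity-type potential accessibility measure, A_i = Σ_j P_j · exp(−β · t_ij), over a small rail graph. This is a generic textbook measure, not the thesis's methodology; the city names, travel times, populations and decay parameter β are all hypothetical.

```python
import math
import networkx as nx

# Toy regional network: edges weighted by rail travel time (minutes).
# Cities, times and the decay parameter beta are hypothetical.
G = nx.Graph()
G.add_weighted_edges_from([
    ("Verona", "Vicenza", 25), ("Vicenza", "Padova", 18),
    ("Padova", "Venezia", 14), ("Treviso", "Venezia", 30),
    ("Verona", "Trento", 55),
])
population = {"Verona": 260, "Vicenza": 110, "Padova": 210,
              "Venezia": 260, "Treviso": 85, "Trento": 120}  # thousands

beta = 0.03  # impedance decay per minute (assumed)
times = dict(nx.all_pairs_dijkstra_path_length(G, weight="weight"))

# Potential accessibility: A_i = sum_j P_j * exp(-beta * t_ij)
for city in G.nodes:
    a = sum(population[dest] * math.exp(-beta * t)
            for dest, t in times[city].items())
    print(f"{city:8s} accessibility = {a:7.1f}")
# Re-running with shortened edge times simulates an HSR-grade upgrade
# and shows how benefits redistribute across the polycentric region.
```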
Abstract:
The National Road Safety Strategy 2011-2020 outlines plans to reduce the burden of road trauma via improvements and interventions relating to safe roads, safe speeds, safe vehicles, and safe people. It also highlights that a key aspect of achieving these goals is the availability of comprehensive data on the issue. Such data are essential both for conducting more in-depth epidemiological studies of risk and for effectively evaluating road safety interventions and programs. Before data are used to evaluate the efficacy of prevention programs, the quality of the underlying data sources should be systematically evaluated to ensure that any identified trends reflect true estimates rather than spurious data effects. However, there has been little scientific work specifically focused on establishing core data quality characteristics pertinent to the road safety field, and limited work undertaken to develop methods for evaluating data sources according to these characteristics. There are a variety of data sources in which traffic-related incidents and resulting injuries are recorded, each collected for its own defined purposes: police reports, transport safety databases, emergency department data, hospital morbidity data and mortality data, to name a few. Because these data are collected for specific purposes, each source has limitations when seeking to gain a complete picture of the problem, including delays in data availability, a lack of accurate and/or specific location information, and under-reporting of crashes involving particular road user groups such as cyclists. This paper proposes core data quality characteristics that could be used to systematically assess road crash data sources, providing a standardised approach for evaluating data quality in the road safety field. The potential for data linkage to qualitatively and quantitatively improve the quality and comprehensiveness of road crash data is also discussed (a toy linkage example is sketched below).
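As a toy illustration of the data linkage discussed above, the sketch below deterministically links invented police crash records to invented hospital admission records on a person identifier and a date window; all field names and values are hypothetical.

```python
import pandas as pd

# Hypothetical extracts: field names and records are invented.
police = pd.DataFrame({
    "person_id": [101, 102, 103],
    "crash_date": pd.to_datetime(["2020-03-01", "2020-03-05", "2020-03-09"]),
    "road_user": ["cyclist", "driver", "pedestrian"],
})
hospital = pd.DataFrame({
    "person_id": [101, 103, 104],
    "admit_date": pd.to_datetime(["2020-03-01", "2020-03-10", "2020-03-20"]),
    "injury_severity": ["serious", "minor", "serious"],
})

# Deterministic link on person_id, then keep admissions within
# 2 days of the crash to filter unrelated hospital episodes.
linked = police.merge(hospital, on="person_id", how="inner")
linked = linked[(linked["admit_date"] - linked["crash_date"]).dt.days.between(0, 2)]
print(linked)
# Police records with no hospital match (e.g. person 102) flag potential
# under-ascertainment of injury outcomes in a single source.
```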
Abstract:
Navigational collisions are one of the major safety concerns in many seaports. Despite the extent of recent work on port navigational safety, little is known about harbor pilots' perception of collision risks in port fairways. This paper uses a hierarchical ordered probit model to investigate associations between perceived risks and the geometric and traffic characteristics of fairways, as well as pilot attributes. Perceived risk data, collected through a risk perception survey conducted among the Singapore port pilots, are used to calibrate the model. The intra-class correlation coefficient justifies the use of the hierarchical model in comparison with an ordinary model. Results show higher perceived risks in fairways attached to anchorages, and in those featuring sharper bends and higher traffic operating speeds. Lower risks are perceived in fairways attached to shorelines and confined waters, and in those with one-way traffic, a traffic separation scheme, cardinal marks and isolated danger marks. Risk is also found to be perceived as higher at night.
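For context on the model class, the sketch below evaluates ordered probit category probabilities for a single fairway: an ordered risk rating falls in category j when the latent index x'β plus standard normal noise lies between cut-points κ_{j-1} and κ_j. All covariates, coefficients and cut-points are hypothetical, and the hierarchical (pilot-level) random effect used in the paper is omitted for brevity.

```python
import numpy as np
from scipy.stats import norm

def ordered_probit_probs(x, beta, cuts):
    """P(y = j) = Phi(kappa_j - x'b) - Phi(kappa_{j-1} - x'b),
    with kappa_0 = -inf and kappa_J = +inf."""
    eta = x @ beta
    kappa = np.concatenate(([-np.inf], cuts, [np.inf]))
    return np.diff(norm.cdf(kappa - eta))

# Hypothetical fairway: [bend sharpness, traffic speed (kn), one-way flag]
x = np.array([0.8, 12.0, 1.0])
beta = np.array([0.9, 0.05, -0.7])   # assumed coefficients
cuts = np.array([0.5, 1.5])          # cut-points for 3 risk levels

p_low, p_med, p_high = ordered_probit_probs(x, beta, cuts)
print(f"P(low)={p_low:.2f}  P(medium)={p_med:.2f}  P(high)={p_high:.2f}")
```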
Abstract:
The effects of tumour motion during radiation therapy delivery have been widely investigated. Motion effects have become increasingly important with the introduction of dynamic radiotherapy delivery modalities such as enhanced dynamic wedges (EDWs) and intensity modulated radiation therapy (IMRT), where a dynamically collimated radiation beam is delivered to the moving target, resulting in dose blurring and interplay effects that are a consequence of the combined tumour and beam motion. Prior to this work, reported studies on EDW-based interplay effects had been restricted to experimental methods for assessing single-field, non-fractionated treatments. In this work, the interplay effects have been investigated for EDW treatments: single and multiple field treatments have been studied using experimental and Monte Carlo (MC) methods. Initially, this work experimentally studies interplay effects for single-field non-fractionated EDW treatments, using radiation dosimetry systems placed on a sinusoidally moving platform. A number of wedge angles (60°, 45° and 15°), field sizes (20 × 20, 10 × 10 and 5 × 5 cm²), amplitudes (10-40 mm in steps of 10 mm) and periods (2 s, 3 s, 4.5 s and 6 s) of tumour motion are analysed (using gamma analysis) for parallel and perpendicular motions (where the tumour and jaw motions are either parallel or perpendicular to each other). For parallel motion it was found that both the amplitude and period of tumour motion affect the interplay; this becomes more prominent where the collimator and tumour speeds become identical. For perpendicular motion the amplitude of tumour motion is the dominant factor, whereas varying the period of tumour motion has no observable effect on the dose distribution. The wedge angle results suggest that the use of a large wedge angle generates greater dose variation for both parallel and perpendicular motions. The use of a small field size with a large tumour motion results in the loss of the wedged dose distribution for both parallel and perpendicular motion. From these single-field measurements, a motion amplitude and period were identified which show the poorest agreement between the target motion and dynamic delivery, and these are used as the 'worst case motion parameters'. The experimental work is then extended to multiple-field fractionated treatments. Here a number of pre-existing, multiple-field, wedged lung plans are delivered to the radiation dosimetry systems, employing the worst case motion parameters. Moreover, a four-field EDW lung plan (using a 4D CT data set) is delivered to the IMRT quality control phantom with a dummy tumour insert over four fractions using the worst case parameters, i.e. 40 mm amplitude and 6 s period. Analysis of the film doses using gamma analysis at 3%/3 mm indicates non-averaging of the interplay effects for this particular study, with a gamma pass rate of 49%. To enable Monte Carlo modelling of the problem, the DYNJAWS component module (CM) of the BEAMnrc user code, recently introduced to model dynamic wedges, is validated and automated. DYNJAWS is commissioned for 6 MV and 10 MV photon energies. It is shown that this CM can accurately model the EDWs for a number of wedge angles and field sizes. The dynamic and step-and-shoot modes of the CM are compared for their accuracy in modelling the EDW, and the dynamic mode is shown to be more accurate. An automation of the DYNJAWS-specific input file has been carried out. This file specifies the probability of selection of a subfield and the respective jaw coordinates, and its automation simplifies the generation of BEAMnrc input files for DYNJAWS. The commissioned DYNJAWS model is then used to study multiple-field EDW treatments using MC methods. The 4D CT data of an IMRT phantom with the dummy tumour are used to produce a set of Monte Carlo simulation phantoms, onto which the delivery of single-field and multiple-field EDW treatments is simulated. A number of static and motion multiple-field EDW plans have been simulated. The comparison of dose volume histograms (DVHs) and gamma volume histograms (GVHs) for four-field EDW treatments (where the collimator and patient motion are in the same direction) using small (15°) and large (60°) wedge angles indicates a greater mismatch between the static and motion cases for the large wedge angle. Finally, to use gel dosimetry as a validation tool, a new technique called the 'zero-scan method' is developed for reading gel dosimeters with x-ray computed tomography (CT). It is shown that multiple scans of a gel dosimeter (in this case 360 scans) can be used to reconstruct a zero-scan image, which has a precision similar to that of an image obtained by averaging the CT images, without the additional dose delivered by the CT scans. In summary, interplay effects have been studied for single and multiple field fractionated EDW treatments using experimental and Monte Carlo methods; the DYNJAWS component module of the BEAMnrc code has been validated, automated and used to study interplay for multiple-field EDW treatments; and the zero-scan method, a new gel dosimetry readout technique, has been developed for reading gel images using x-ray CT without loss of precision or accuracy. A minimal sketch of the gamma analysis used throughout follows.
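Since the analyses above repeatedly use the gamma index at 3%/3 mm, the sketch below shows a minimal global 1D gamma computation between a reference and an evaluated dose profile. The profiles are synthetic, and clinical implementations add dose thresholding, normalisation choices and interpolation that are omitted here.

```python
import numpy as np

def gamma_1d(dose_ref, dose_eval, positions, dd=0.03, dta=3.0):
    """Global 1D gamma index: for each reference point, minimise
    sqrt((dr/dta)^2 + (dD/(dd*Dmax))^2) over evaluated points."""
    d_max = dose_ref.max()
    gamma = np.empty_like(dose_ref)
    for i, (r, d) in enumerate(zip(positions, dose_ref)):
        dist = (positions - r) / dta              # spatial term (mm)
        diff = (dose_eval - d) / (dd * d_max)     # dose term (global norm)
        gamma[i] = np.sqrt(dist**2 + diff**2).min()
    return gamma

# Synthetic wedge-like profiles; the evaluated one is shifted by 2 mm
# to mimic a small delivery error (illustrative only).
x = np.linspace(0, 100, 201)                     # positions in mm
ref = np.clip(1.2 - 0.008 * x, 0, None)
ev = np.clip(1.2 - 0.008 * (x - 2.0), 0, None)

g = gamma_1d(ref, ev, x)
print(f"gamma pass rate (3%/3mm): {100 * (g <= 1).mean():.1f}%")
```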
Abstract:
With the increasing rate of shipping traffic, the risk of collisions in busy and congested port waters is expected to rise. However, due to low collision frequencies, it is difficult to analyse such risk in a sound statistical manner. This study examines the occurrence of traffic conflicts in order to understand the characteristics of vessels involved in navigational hazards. A binomial logit model was employed to evaluate the association of vessel attributes and kinematic conditions with conflict severity levels. Results show a positive association with conflict risk for vessels of small gross tonnage, and for overall vessel length, vessel height and draft. Conflicts involving a pair of dynamic vessels sailing at low speeds also have similar effects.
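As a minimal illustration of the binomial logit model class employed here, the sketch below fits a logit to synthetic conflict records relating a binary severity outcome to invented vessel attributes; neither the data nor the coefficient values reflect the study's actual results.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200

# Invented vessel attributes: gross tonnage (kt), length (m), speed (kn).
tonnage = rng.uniform(1, 60, n)
length = rng.uniform(50, 300, n)
speed = rng.uniform(2, 18, n)

# Synthetic severity outcome (1 = serious conflict), illustration only.
latent = -1.0 - 0.04 * tonnage + 0.005 * length - 0.08 * speed
y = rng.binomial(1, 1 / (1 + np.exp(-latent)))

X = sm.add_constant(np.column_stack([tonnage, length, speed]))
model = sm.Logit(y, X).fit(disp=False)
print(model.params)  # const, tonnage, length, speed coefficients
# A negative tonnage coefficient here mirrors the reported pattern of
# smaller vessels being associated with higher conflict risk.
```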
Abstract:
Complexity is a major concern that people aim to overcome through modelling. One way of reducing complexity is separation of concerns, e.g. separation of the business process from applications. One sort of concern is cross-cutting concerns, i.e. concerns which are scattered and tangled through one or several models. In business process management, examples of such concerns are security and privacy policies. To deal with these cross-cutting concerns, the aspect-oriented approach was introduced in the software development area and recently also in the business process management area. The work presented in this paper elaborates on aspect-oriented process modelling. It extends earlier work by defining a mechanism for capturing multiple concerns and specifying a precedence order according to which they should be handled in a process. A formal syntax of the notation is presented, precisely capturing the extended concepts and mechanisms. Finally, the relevance of the approach is demonstrated through a case study.
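To make the precedence mechanism concrete, the sketch below weaves several cross-cutting concerns around a business process task in an explicit precedence order. The Advice structure, the weave function and the concern names are invented for illustration and do not reproduce the paper's notation.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Advice:
    """A cross-cutting concern attached to a task, with a precedence
    rank: lower rank is applied outermost, so it runs first."""
    name: str
    precedence: int
    wrap: Callable[[Callable[[], None]], Callable[[], None]]

def weave(task: Callable[[], None], advices: list[Advice]) -> Callable[[], None]:
    # Wrap highest-rank advice innermost so the lowest rank ends up
    # outermost and executes first, honouring the precedence order.
    woven = task
    for adv in sorted(advices, key=lambda a: a.precedence, reverse=True):
        woven = adv.wrap(woven)
    return woven

def logging_wrap(inner):
    def run():
        print("log: task starting"); inner(); print("log: task done")
    return run

def auth_wrap(inner):
    def run():
        print("security: checking authorisation"); inner()
    return run

task = lambda: print("executing: approve payment")
# Security (rank 1) must be handled before logging (rank 2).
woven = weave(task, [Advice("logging", 2, logging_wrap),
                     Advice("security", 1, auth_wrap)])
woven()
```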
Abstract:
The Routledge Handbook of Critical Criminology is a collection of original essays specifically designed to offer students, faculty, policy makers, and others an in-depth overview of the most up-to-date empirical, theoretical, and political contributions made by critical criminologists around the world. Special attention is devoted to new theoretical directions in the field, such as cultural criminology, masculinities studies, and feminist criminologies. Its diverse essays cover not only the history of critical criminology and cutting-edge theories, but also the variety of research methods used by leading scholars in the field and the rich data generated by their rigorous empirical work. In addition, some of the chapters suggest innovative and realistic short- and long-term policy proposals that are typically ignored by mainstream criminology. These progressive strategies address some of the most pressing social problems facing contemporary society, problems that generate much pain and suffering for socially and economically disenfranchised people. The Handbook explores up-to-date empirical, theoretical, and political contributions, and is specifically designed to be a comprehensive resource for undergraduate and post-graduate students, researchers, and policy makers.
Abstract:
In recent decades the debate among scholars, lawyers, politicians and others about how societies deal with their past has been constant and intensive. 'Legal Institutions and Collective Memories' situates the processes of transitional justice at the intersection between legal procedures and the production of collective and shared meanings of the past. Building upon the work of Maurice Halbwachs, this collection of essays emphasises the extended role and active involvement of contemporary law and legal institutions in public discourse about the past, and explores their impact on the shape that collective memories take in the course of time. The authors uncover a complex pattern of searching for truth, negotiating the past and cultivating the art of forgetting. Their contributions explore the ambiguous and intricate links between the production of justice, truth and memory. The essays cover a broad range of legal institutions, countries and topics. These include transitional trials as 'monumental spectacles' as well as constitutional courts, and the restitution of property rights in Central and Eastern Europe and Australia. The authors explore the biographies of victims and how their voices were repressed, as in the case of Korean Comfort Women. They explore the role of law and legal institutions in linking individual and collective memories in the transitional period through processes of lustration, and they analyse divided memories about the past and their impact on future reconciliation in South Africa. The collection offers a genuinely comparative approach, allied to cutting-edge theory.
Abstract:
Issues in Green Criminology: confronting harms against environments, humanity and other animals aims to provide, if not a manifesto, then at least a significant resource for thinking about green criminology, a rapidly developing field. It offers a set of specially written introductions and a variety of current and new directions, wide-ranging in scope and international in terms of coverage and contributors. It provides focused discussions of current and cutting-edge issues that will influence the emergence of a coherent perspective on green issues. The contributors are drawn from the leading thinkers in the field. The twelve chapters of the book explore the myriad ways in which governments, transnational corporations, military apparatuses and ordinary people going about their everyday lives routinely harm environments, other animals and humanity. The book will be essential reading not only for students taking courses in colleges and universities but also for activists in the environmental and animal rights movements. Its concern is with an ever-expanding agenda: the whys, the hows and the whens of the generation and control of the many aspects of harm to environments, ecological systems and all species of animals, including humans. These harms include, but are not limited to, exploitation, modes of discrimination and disempowerment, degradation, abuse, exclusion, pain, injury, loss and suffering. Straddling and intersecting these many forms of harm are key concepts for a green criminology such as gender inequalities, racism, dominionism and speciesism, classism, the north/south divide, the accountability of science, and the ethics of global capitalist expansion. Green criminology has the potential not only to provide a different way of examining and making sense of various forms of crime and control responses (some well known, others less so) but also to make explicable much wider connections that are not generally well understood. As all societies face up to the need to confront harms against environments, other animals and humanity, criminology will have a major role to play. This book will be an essential part of this process.