909 results for "Prolonged application times"


Relevance: 20.00%

Abstract:

Of the numerous factors that play a role in fatal pedestrian collisions, the time of day, day of the week, and time of year can be significant determinants. More than 60% of all pedestrian collisions in 2007 occurred at night, despite the presumed decrease in both pedestrian and automobile exposure during the night. Although this trend is partially explained by factors such as fatigue and alcohol consumption, prior analysis of the Fatality Analysis Reporting System database suggests that pedestrian fatalities increase as light decreases, after controlling for other factors. This study applies graphical cross-tabulation, a novel visual assessment approach, to explore the relationships among collision variables. The results reveal that twilight and the first hour of darkness typically see the greatest frequency of fatal pedestrian collisions. These hours are not necessarily the riskiest on a per-mile-travelled basis, however, because pedestrian volumes are often still high. Additional analysis is needed to quantify the extent to which pedestrian exposure (walking/crossing activity) in these time periods plays a role in pedestrian crash involvement. Weekly patterns of fatal pedestrian collisions vary by time of year owing to seasonal changes in sunset time. In December, collisions are concentrated around twilight and the first hour of darkness throughout the week, while in June they are most heavily concentrated around twilight and the first hours of darkness on Friday and Saturday. Friday and Saturday nights in June may therefore be the most dangerous times for pedestrians. Knowing when pedestrian risk is highest is critically important for formulating effective mitigation strategies and for efficiently investing safety funds. This applied visual approach is a helpful tool for researchers intending to communicate with policy-makers and to identify relationships that can then be tested with more sophisticated statistical tools.
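Graphical cross-tabulation is a visual technique, but the tabulation step behind it is easy to sketch. A minimal, hypothetical illustration (the record format, sample data, and function names are assumptions, not taken from the study):

```python
from collections import Counter

def cross_tabulate(records):
    """Count crashes per (weekday, hour) cell -- the table behind a
    graphical cross-tabulation / heat-map style display."""
    return Counter(records)

def peak_cell(table):
    """Return the (weekday, hour) cell with the most crashes."""
    return max(table, key=table.get)

# Hypothetical mini-sample standing in for records extracted from a
# crash database such as FARS:
sample = [
    ("Fri", 18), ("Fri", 18), ("Fri", 19),
    ("Sat", 18), ("Sat", 18), ("Sat", 18),
    ("Mon", 8), ("Wed", 12),
]
table = cross_tabulate(sample)
```

On real data the resulting table would be rendered as a shaded grid rather than inspected cell by cell.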

Relevance: 20.00%

Abstract:

Purpose. To investigate the effect of various presbyopic vision corrections on nighttime driving performance on a closed-road driving circuit. Methods. Participants were 11 presbyopes (mean age, 57.3 ± 5.8 years), with a mean best sphere distance refractive error of R+0.23±1.53 DS and L+0.20±1.50 DS, whose only experience of wearing presbyopic vision correction was reading spectacles. The study involved a repeated-measures design by which a participant's nighttime driving performance was assessed on a closed-road circuit while wearing each of four power-matched vision corrections. These included single-vision distance lenses (SV), progressive-addition spectacle lenses (PAL), monovision contact lenses (MV), and multifocal contact lenses (MTF CL) worn in a randomized order. Measures included low-contrast road hazard detection and avoidance, road sign and near target recognition, lane-keeping, driving time, and legibility distance for street signs. Eye movement data (fixation duration and number of fixations) were also recorded. Results. Street sign legibility distances were shorter when wearing MV and MTF CL than SV and PAL (P < 0.001), and participants drove more slowly with MTF CL than with PALs (P = 0.048). Wearing SV resulted in more errors (P < 0.001) and in more (P = 0.002) and longer (P < 0.001) fixations when responding to near targets. Fixation duration was also longer when viewing distant signs with MTF CL than with PAL (P = 0.031). Conclusions. Presbyopic vision corrections worn by naive, unadapted wearers affected nighttime driving. Overall, spectacle corrections (PAL and SV) performed well for distance driving tasks, but SV negatively affected viewing near dashboard targets. MTF CL resulted in the shortest legibility distance for street signs and longer fixation times.

Relevance: 20.00%

Abstract:

The Rudd Labor Government rode to power in Australia on the promise of an 'education revolution'. The term carries all the obligatory marketing metaphors that an aspirant government might want recognised by the general public on the eve it came to power; however, in revolutionary terms it fades into insignificance in comparison to the real revolution in Australian education. This revolution, simply put, is the elevation of Indigenous Knowledge Systems in Australian universities. In the forty-three years since the nation-setting Referendum of 1967, a generation has made a beachhead on the educational landscape. A further generation, having made it into the field of higher degrees, now seeks the ways and means to authentically marshal Indigenous knowledge. The Institute of Koorie Education at Deakin has for over twenty years not only witnessed this transition but has also been a leader in the field. With the appointment of two Chairs of Indigenous Knowledge Systems to build on its already established research profile, the Institute moved towards what is the 'real revolution' in education – the elevation of Indigenous Knowledge as a legitimate knowledge system. This paper lays out the Institute of Koorie Education's Research Plan and the basis of an argument put to the academy that will be the driver for this pursuit.

Relevance: 20.00%

Abstract:

This paper presents a Genetic Algorithm (GA) approach to resolving traffic conflicts at a railway junction. The formulation of the problem for the application of a GA is discussed, and three neighborhoods are proposed for generation evolution. The performance of the GA is evaluated by computer simulation. This study paves the way for further applications of artificial intelligence techniques in a rather conservative industry.
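The abstract does not spell out the GA formulation; a plausible minimal sketch encodes a candidate solution as the order in which trains cross the junction, with a swap move standing in for one of the neighborhood operators (train times, headway, and all operators here are invented for illustration, and crossover is omitted for brevity):

```python
import random

random.seed(1)

# Hypothetical trains: (name, time at which the train reaches the junction).
trains = [("A", 0.0), ("B", 0.5), ("C", 0.7), ("D", 3.0)]
HEADWAY = 1.0  # minimum separation (minutes) through the junction

def total_delay(order):
    """Total delay when trains cross the junction in `order`,
    each at least HEADWAY after the previous one."""
    t, delay = -HEADWAY, 0.0
    for i in order:
        _, ready = trains[i]
        t = max(ready, t + HEADWAY)   # wait for the junction to clear
        delay += t - ready
    return delay

def mutate(order):
    """Swap two positions -- one simple neighborhood move."""
    a, b = random.sample(range(len(order)), 2)
    child = list(order)
    child[a], child[b] = child[b], child[a]
    return child

def ga(generations=200, pop_size=20):
    n = len(trains)
    # Seed the population with the first-come-first-served order so the
    # GA never does worse than that baseline.
    pop = [list(range(n))]
    pop += [random.sample(range(n), n) for _ in range(pop_size - 1)]
    for _ in range(generations):
        pop.sort(key=total_delay)
        pop = pop[: pop_size // 2]                  # keep the fitter half
        pop += [mutate(random.choice(pop)) for _ in range(pop_size - len(pop))]
    return min(pop, key=total_delay)

best = ga()
```

Seeding with the first-come-first-served order is a common trick: elitist selection then guarantees the GA's answer is at least as good as the baseline schedule.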

Relevance: 20.00%

Abstract:

User needs and wants dictate the way in which products are designed, produced, used and disposed of. Western society in particular has become very consumer driven, and the waste resulting from such activity has the potential to be disastrous. The creation of emotional attachment with possessions is one way of approaching sustainable consumer-product relationships. The aim of this research was to gain a deeper understanding of the interaction and emotional attachment that consumers have and develop with their products. It outlines literature relating to consumer emotion and experience in relation to products, and how pleasurable product user relationships can be prolonged. It is evident from the literature that the roles of materials in the emotional attachment consumers have with products needed to be further explored. A study was conducted to determine consumers' concepts of six materials currently used in product design. This involved participants being given a Concept Prompt Probe with textual prompts to assist in discussion about the materials in question. The discussions between the 15 participant groups of two people, one male and one female, were then transcribed and coded ready for analysis. The study findings demonstrate consumers' concepts of the six materials. The findings show both physical and emotional consumer concepts of the materials. It is, however, the interaction of these concepts that is the most significant finding of this research. Each material concept is not only judged emotionally by consumers in its own right but in relation to other concepts as well. The interaction of the consumers' concepts of materials can considerably affect the emotional judgement made about the material and the appropriateness of its application. This research makes a significant contribution to knowledge regarding the effect materials have on consumers by identifying how materials can prompt emotional judgements and thereby alter the product user experience.

Relevance: 20.00%

Abstract:

With the advances in computer hardware and software development techniques in the past 25 years, digital computer simulation of train movement and traction systems has been widely adopted as a standard computer-aided engineering tool [1] during the design and development stages of existing and new railway systems. Simulators of different approaches and scales are used extensively to investigate various kinds of system studies. Simulation is now proven to be the cheapest means to carry out performance prediction and system behaviour characterisation. When computers were first used to study railway systems, they were mainly employed to perform repetitive but time-consuming computational tasks, such as matrix manipulations for power network solution and exhaustive searches for optimal braking trajectories. With only simple high-level programming languages available at the time, full advantage of the computing hardware could not be taken. Hence, structured simulations of the whole railway system were not very common. Most applications focused on isolated parts of the railway system. It is more appropriate to regard those applications as primarily mechanised calculations rather than simulations. However, a railway system consists of a number of subsystems, such as train movement, power supply and traction drives, which inevitably contain many complexities and diversities. These subsystems interact frequently with each other while the trains are moving, and they have their special features in different railway systems. To further complicate the simulation requirements, constraints like track geometry, speed restrictions and friction have to be considered, not to mention possible non-linearities and uncertainties in the system.
In order to provide a comprehensive and accurate account of system behaviour through simulation, a large amount of data has to be organised systematically to ensure easy access and efficient representation, and the interactions and relationships among the subsystems should be defined explicitly. These requirements call for sophisticated and effective simulation models for each component of the system. The software development techniques available nowadays allow the evolution of such simulation models. Advanced software design not only greatly enhances the applicability of the simulators; it also encourages maintainability and modularity, for easy understanding and further development, and portability across hardware platforms. The objective of this paper is to review the development of a number of approaches to simulation models. Attention is, in particular, given to models for train movement, power supply systems and traction drives. These models have been successfully used to enable various 'what-if' issues to be resolved effectively in a wide range of applications, such as speed profiles, energy consumption, run times, etc.
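To give a flavour of the train-movement component such simulators contain, here is a deliberately simple point-mass sketch (all parameters are invented; real models add gradients, curve resistance, speed restrictions and traction characteristics):

```python
def run_time(distance, v_max, accel, decel, dt=0.1):
    """Point-mass train movement over a flat block of `distance` metres:
    accelerate to line speed v_max, cruise, then brake to stop at the
    end of the block. Forward-Euler integration; returns (run time in
    seconds, speed profile)."""
    x, v, t = 0.0, 0.0, 0.0
    profile = []
    while x < distance:
        braking_dist = v * v / (2 * decel)
        if distance - x <= braking_dist:
            a = -decel           # brake so as to stop at the end
        elif v < v_max:
            a = accel            # accelerate toward line speed
        else:
            a = 0.0              # cruise
        v = max(0.0, min(v_max, v + a * dt))
        x += v * dt
        t += dt
        profile.append(v)
    return t, profile

# 1 km block, 20 m/s line speed, 1 m/s^2 acceleration and braking.
t, profile = run_time(1000.0, 20.0, 1.0, 1.0)
```

Analytically this trip takes about 70 s (20 s accelerating over 200 m, 30 s cruising over 600 m, 20 s braking over 200 m); the discrete integration lands close to that.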

Relevance: 20.00%

Abstract:

This paper introduces an event-based traffic model for railway systems adopting fixed-block signalling schemes. In this model, the events of trains' arrival at and departure from signalling blocks constitute the states of the traffic flow. A state transition is equivalent to the progress of the trains by one signalling block and it is realised by referring to past and present states, as well as a number of pre-calculated look-up tables of run-times in the signalling block under various signalling conditions. Simulation results are compared with those from a time-based multi-train simulator to study the improvement of processing time and accuracy.
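The look-up-table mechanics behind such a state transition can be illustrated with a toy version (the block layout, run times, and two-aspect signalling here are invented; the paper's tables cover many more signalling conditions):

```python
# Hypothetical pre-computed run times (seconds) per signalling block,
# keyed by the signal aspect seen on entry: green = full speed,
# yellow = braking behind the train ahead.
RUN_TIME = {"green": 60, "yellow": 90}

def step(positions, clocks, n_blocks):
    """One state transition: advance each train by one signalling block.

    `positions[i]` is train i's current block (train 0 is the leader),
    `clocks[i]` the time at which it entered that block. Returns the
    new (positions, clocks) state.
    """
    new_pos, new_clk = list(positions), list(clocks)
    for i in range(len(positions)):
        # Block occupied by the train ahead (the leader sees open track).
        ahead = new_pos[i - 1] if i > 0 else n_blocks + 1
        if new_pos[i] + 1 < ahead:            # next block is clear
            aspect = "green" if new_pos[i] + 2 < ahead else "yellow"
            new_clk[i] += RUN_TIME[aspect]    # event time from the table
            new_pos[i] += 1
    return new_pos, new_clk

# Leader in block 3, follower in block 2, on a 10-block line:
state = step([3, 2], [0, 0], 10)
```

Because each transition only does table look-ups, the event-based model avoids the fine time-stepping of a conventional multi-train simulator.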

Relevance: 20.00%

Abstract:

Reactive oxygen species (ROS) and related free radicals are considered to be key factors underpinning the various adverse health effects associated with exposure to ambient particulate matter. Therefore, measurement of ROS is a crucial factor for assessing the potential toxicity of particles. In this work, a novel profluorescent nitroxide, BPEAnit, was investigated as a probe for detecting particle-derived ROS. BPEAnit has a very low fluorescence emission due to inherent quenching by the nitroxide group, but upon radical trapping or redox activity, a strong fluorescence is observed. BPEAnit was tested for detection of ROS present in mainstream and sidestream cigarette smoke. In the case of mainstream cigarette smoke, there was a linear increase in fluorescence intensity with an increasing number of cigarette puffs, equivalent to an average of 101 nmol ROS per cigarette based on the number of moles of the probe reacted. Sidestream cigarette smoke sampled from an environmental chamber exposed BPEAnit to much lower concentrations of particles, but still resulted in a clearly detectable increase in fluorescence intensity with sampling time. It was calculated that the amount of ROS was equivalent to 50 ± 2 nmol per mg of particulate matter; however, this value decreased with ageing of the particles in the chamber. Overall, BPEAnit was shown to provide a sensitive response related to the oxidative capacity of the particulate matter. These findings present a good basis for employing the new BPEAnit probe for the investigation of particle-related ROS generated from cigarette smoke as well as from other combustion sources.

Relevance: 20.00%

Abstract:

The dynamics of droplets exhaled from the respiratory system during coughing or talking is addressed. A mathematical model is presented that accounts for the motion of a droplet in conjunction with its evaporation. Droplet evaporation and motion are treated under two scenarios: (1) a well-mixed droplet and (2) a droplet with internal composition variation. A multiple-shells model was implemented to account for internal mass and heat transfer and for concentration and temperature gradients inside the droplet. The trajectories of the droplets are computed for a range of conditions, and the spatial distribution and residence times of such droplets are evaluated.
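A heavily simplified version of the first (well-mixed) scenario can be written down directly: a d²-law for evaporation coupled with Stokes settling under gravity. All constants and the coupling itself are illustrative assumptions, far cruder than the paper's multi-shell model:

```python
# Illustrative physical constants (SI units).
RHO_W, MU_AIR, G = 1000.0, 1.8e-5, 9.81
K = 1.0e-9  # assumed evaporation rate constant for d^2 (m^2/s)

def simulate(d0, height, dt=1e-3):
    """Track a droplet of initial diameter d0 (m) released at `height`
    (m) until it fully evaporates or settles to the floor.
    Returns (fate, time in seconds)."""
    d2, z, t = d0 * d0, height, 0.0
    while z > 0:
        d2 -= K * dt                          # d^2 law: d^2 shrinks linearly
        if d2 <= 0:
            return "evaporated", t
        v = RHO_W * G * d2 / (18 * MU_AIR)    # Stokes settling velocity
        z -= v * dt
        t += dt
    return "deposited", t

# A 10 um droplet evaporates almost immediately; a 100 um droplet
# reaches the floor first.
fate_small, t_small = simulate(10e-6, 1.0)
fate_large, t_large = simulate(100e-6, 1.0)
```

Even this toy model reproduces the classic Wells split: small droplets evaporate aloft (becoming droplet nuclei) while large ones deposit nearby.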

Relevance: 20.00%

Abstract:

The aim of this work was to quantify exposure to particles emitted by wood-fired ovens in pizzerias. Overall, 15 microenvironments were chosen and analyzed in a 14-month experimental campaign. Particle number concentration and distribution were measured simultaneously using a Condensation Particle Counter (CPC), a Scanning Mobility Particle Sizer (SMPS), and an Aerodynamic Particle Sizer (APS). The surface area and mass distributions and concentrations, as well as the estimation of lung-deposited surface area and PM1, were evaluated using the SMPS-APS system with dosimetric models, taking into account the presence of aggregates on the basis of the Idealized Aggregate (IA) theory. The fraction of inhaled particles deposited in the respiratory system and different fractions of particulate matter were also measured by means of a Nanoparticle Surface Area Monitor (NSAM) and a photometer (DustTrak DRX), respectively. In this way, supplementary data were obtained during the monitoring of trends inside the pizzerias. We found that surface area and PM1 particle concentrations in pizzerias can be very high, especially when compared to other critical microenvironments, such as transport hubs. During pizza cooking under normal ventilation conditions, concentrations were found up to 74, 70 and 23 times higher than background levels for number, surface area and PM1, respectively. A key parameter is the oven shape factor, defined as the ratio between the size of the face opening in respect

Relevance: 20.00%

Abstract:

This is the first in a series of four articles which will explore different aspects of air pollution, its impact on health, and challenges in defining the boundaries between impact and non-impact on health. Hardly a new topic, one might say. Indeed, it’s been an issue for centuries, millennia even! For example, Pliny the Elder (AD 23-79), a Roman officer and author of the ‘Natural History’, recommended that: “…quarry slaves from asbestos mines not be purchased because they die young”, and suggested: “…the use of a respirator, made of transparent bladder skin, to protect workers from asbestos dust.” Closer to modern times, a Danish proverb states: "Fresh air impoverishes the doctor". While none of these statements is an air quality guideline in a modern sense, they do illustrate that, for a very long time, we have known that there is a link between air quality and health, and that some measures were taken to reduce the impact of exposure to the pollutants. Obviously, we are much more sophisticated now!

Relevance: 20.00%

Abstract:

This overview focuses on the application of chemometrics techniques for the investigation of soils contaminated by polycyclic aromatic hydrocarbons (PAHs) and metals, because these two important and very diverse groups of pollutants are ubiquitous in soils. The salient features of various studies carried out in the micro- and recreational environments of humans are highlighted in the context of the various multivariate statistical techniques available across discipline boundaries that have been effectively used in soil studies. Particular attention is paid to techniques employed in the geosciences that may be effectively utilized for environmental soil studies; classical multivariate approaches that may be used in isolation or as complementary methods to these are also discussed. Chemometrics techniques widely applied in atmospheric studies for identifying sources of pollutants or for determining the importance of contaminant source contributions to a particular site have seen little use in soil studies, but may be effectively employed in such investigations. Suitable programs are also available for suggesting mitigating measures in cases of soil contamination, and these are also considered. Specific techniques reviewed include pattern recognition techniques such as Principal Components Analysis (PCA), Fuzzy Clustering (FC) and Cluster Analysis (CA); geostatistical tools include variograms, Geographical Information Systems (GIS), contour mapping and kriging; source identification and contribution estimation methods reviewed include Positive Matrix Factorisation (PMF), and Principal Component Analysis on Absolute Principal Component Scores (PCA/APCS). Mitigating measures to limit or eliminate pollutant sources may be suggested through the use of ranking analysis and multi-criteria decision-making (MCDM) methods.
These methods are mainly represented in this review by studies employing the Preference Ranking Organisation Method for Enrichment Evaluation (PROMETHEE) and its associated graphic output, Geometrical Analysis for Interactive Aid (GAIA).
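Of the techniques listed, PCA is the easiest to sketch without external libraries. A toy, pure-Python version computes only the leading component via power iteration; the soil-sample values are invented for illustration:

```python
def standardize(rows):
    """Autoscale each column to zero mean and unit variance -- standard
    pre-treatment before PCA when variables have different units."""
    n, p = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(p)]
    stds = [(sum((r[j] - means[j]) ** 2 for r in rows) / (n - 1)) ** 0.5
            for j in range(p)]
    return [[(r[j] - means[j]) / stds[j] for j in range(p)] for r in rows]

def first_component(x, iters=200):
    """Leading principal component via power iteration on the
    covariance matrix of the (already scaled) data."""
    n, p = len(x), len(x[0])
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / (n - 1)
            for b in range(p)] for a in range(p)]
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v

# Hypothetical soil samples, (PAH ppm, Pb ppm): the two contaminants are
# strongly correlated, so PC1 should load almost equally on both --
# suggesting a common source.
samples = [(1.0, 10.0), (2.0, 21.0), (3.0, 29.0), (4.0, 41.0), (5.0, 50.0)]
pc1 = first_component(standardize(samples))
```

In real soil studies PCA is of course run on dozens of contaminants across many sites, with the score plots used to group samples and suggest sources.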

Relevance: 20.00%

Abstract:

This paper investigates how to interface the wireless application protocol (WAP) architecture to a SCADA system running the distributed network protocol (DNP) in a power process plant. DNP is a well-developed protocol for supervisory control and data acquisition (SCADA) systems, but the system control centre and remote terminal units (RTUs) are presently connected through a local area network. The conditions in a process plant are harsh and the site is remote. Resources for data communication are difficult to obtain under these conditions; thus, wireless communication through a mobile phone is practical and efficient in a process plant environment. The mobile communication industry and the public have a strong interest in applying WAP technology in mobile phone networks, and the WAP application programming interface (API) in power industry applications is one area that requires extensive investigation.

Relevance: 20.00%

Abstract:

Objective. To estimate the excess length of stay in an intensive care unit (ICU) due to a central line–associated bloodstream infection (CLABSI), using a multistate model that accounts for the timing of infection. Design. A cohort of 3,560 patients followed up for 36,806 days in ICUs. Setting. Eleven ICUs in 3 Latin American countries: Argentina, Brazil, and Mexico. Patients. All patients admitted to the ICU during a defined time period with a central line in place for more than 24 hours. Results. The average excess length of stay due to a CLABSI increased in 10 of 11 ICUs and varied from −1.23 days to 4.69 days. A reduction in length of stay in Mexico was probably caused by an increased risk of death due to CLABSI, leading to shorter times to death. Adjusting for patient age and Average Severity of Illness Score tended to increase the estimated excess length of stay due to CLABSI. Conclusions. CLABSIs are associated with an excess length of ICU stay. The average excess length of stay varies between ICUs, most likely because of the case-mix of admissions and differences in the ways that hospitals deal with infections.
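The key idea, accounting for the timing of infection rather than simply comparing ever-infected with never-infected patients, can be demonstrated with a toy simulation. All hazards below are invented, and the timing-aware contrast is a simple matching shortcut, not the multistate estimator the study actually uses:

```python
import random

random.seed(7)

# Illustrative daily hazards -- assumed values, not from the study.
P_INF = 0.02    # daily probability of acquiring a CLABSI
P_OUT = 0.15    # daily discharge hazard while uninfected
EXTRA = 0.05    # infection lowers the discharge hazard by this much

def simulate_patient():
    """Daily Markov walk: returns (infection_day or None, total LOS)."""
    day, infected_on = 0, None
    while True:
        day += 1
        if infected_on is None and random.random() < P_INF:
            infected_on = day
        out = P_OUT - EXTRA if infected_on is not None else P_OUT
        if random.random() < out:
            return infected_on, day

patients = [simulate_patient() for _ in range(4000)]

# Naive contrast: ever-infected vs never-infected mean LOS. This is
# biased upward, because acquiring an infection requires having
# already stayed long enough to acquire it.
inf = [los for d, los in patients if d is not None]
non = [los for d, los in patients if d is None]
naive = sum(inf) / len(inf) - sum(non) / len(non)

def excess_timing_aware(patients):
    """Timing-aware contrast: compare each infected patient's remaining
    stay, from their infection day d, with patients still at risk
    (in the ICU and not yet infected) on day d."""
    diffs = []
    for d, los in patients:
        if d is None:
            continue
        at_risk = [l - d for dd, l in patients
                   if l >= d and (dd is None or dd > d)]
        if at_risk:
            diffs.append((los - d) - sum(at_risk) / len(at_risk))
    return sum(diffs) / len(diffs)

excess = excess_timing_aware(patients)
```

On this simulated cohort the naive contrast substantially exceeds the timing-aware one, which is exactly the time-dependent bias a multistate model is designed to remove.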

Relevance: 20.00%

Abstract:

Photo-curable biodegradable macromers were prepared by ring-opening polymerization of D,L-lactide (DLLA), ε-caprolactone (CL) and 1,3-trimethylene carbonate (TMC) in the presence of glycerol or sorbitol as initiator and stannous octoate as catalyst, and subsequent methacrylation of the terminal hydroxyl groups. These methacrylated macromers, ranging in molecular weight from approximately 700 to 6000 g/mol, were cross-linked using ultraviolet (UV) light to form biodegradable networks. Homogeneous networks with high gel contents were prepared. One of the resins, based on PTMC, was used to prepare three-dimensional structures by stereolithography using a commercially available apparatus.