945 results for long memory
Abstract:
This paper describes the optimization of conductor size and of voltage regulator location and magnitude for long rural distribution lines. The optimization uses a Genetic Algorithm (GA) to minimize the lifetime cost of the lines, including capital costs and losses, while observing voltage-drop and operational constraints. The GA optimization is applied to a real Single Wire Earth Return (SWER) network in regional Queensland, and results are presented.
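A minimal GA sketch of this kind of line-design optimization is given below, assuming a simplified chromosome of one conductor index per line segment plus a regulator position and boost step; the cost model, voltage-drop check, and all constants are illustrative placeholders rather than the paper's SWER data.

```python
import random

N_SEG, POP, GENS = 10, 60, 200
CONDUCTORS = [0, 1, 2]                       # hypothetical conductor catalogue indices

def rand_ind():
    # genome: conductor index per segment + regulator segment + boost step
    return ([random.choice(CONDUCTORS) for _ in range(N_SEG)]
            + [random.randrange(N_SEG), random.randrange(5)])

def cost(ind):
    segs, reg_pos, boost_step = ind[:N_SEG], ind[N_SEG], ind[N_SEG + 1]
    capital = sum(1.0 + 0.6 * c for c in segs) + 3.0 * (boost_step > 0)
    losses = sum(1.0 / (1.0 + c) for c in segs)          # thinner wire, more loss
    # cumulative per-unit voltage drop; the regulator boosts from its segment on
    drop, worst = 0.0, 0.0
    for i, c in enumerate(segs):
        drop += 0.015 / (1.0 + c)
        if i == reg_pos:
            drop -= 0.01 * boost_step
        worst = max(worst, drop)
    penalty = 100.0 * max(0.0, worst - 0.06)             # e.g. 6% drop limit
    return capital + 8.0 * losses + penalty              # lifetime-cost proxy

def tournament(pop):
    return min(random.sample(pop, 3), key=cost)

def breed(a, b):
    cut = random.randrange(1, len(a))                    # one-point crossover
    child = a[:cut] + b[cut:]
    if random.random() < 0.2:                            # point mutation
        i = random.randrange(len(child))
        child[i] = rand_ind()[i]
    return child

pop = [rand_ind() for _ in range(POP)]
for _ in range(GENS):
    pop = [breed(tournament(pop), tournament(pop)) for _ in range(POP)]
best = min(pop, key=cost)
print("best cost:", round(cost(best), 2), "design:", best)
```

A real implementation would replace the toy drop model with a load-flow calculation over the SWER network, but the penalty-based handling of the voltage constraint is a common GA design choice.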
Abstract:
On 13 February 2008, Prime Minister Kevin Rudd made an apology to Australia’s Indigenous Peoples on behalf of the Australian Parliament. The State Library of Queensland (SLQ), with assistance from Queensland University of Technology and Queensland’s Aboriginal and Torres Strait Islander communities, has captured responses to this historic event. ‘Responses to the 2008 Apology’ is a collection of digital stories created as part of this research initiative. Until recently, digital storytelling has not generally been treated as a necessary addition to the research collections of Australian libraries. However, libraries increasingly aim to promote new literacies and active audiences as they seek innovative ways to encourage lifelong learning by their users, and digital storytelling is one methodology that can contribute to these goals. The State Library of Queensland is the only Australian State Library to have undertaken a major role in the collection of digital stories. It currently leads the way with its Queensland Stories digital storytelling program. This presentation will report findings and outcomes from this research project.
Abstract:
The paper explores the way in which the life of concrete sleepers can be dramatically affected by two important factors, namely impact forces and fatigue cycles. Drawing on the very limited experimental and field data currently available about these two factors, the paper describes detailed simulations of sleepers in a heavy haul track in Queensland, Australia over a period of 100 years. The simulation uses real wheel/rail impact force records from that track, together with data on static bending tests of similar sleepers and preliminary information on their impact versus static strength. The simulations suggest that despite successful performance over many decades, large unplanned replacement costs could be imminent, especially considering the increasingly demanding operational conditions sleepers have sustained over their lives. The paper also discusses the key factors track owners need to consider in attempting to plan for these developments.
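The kind of life simulation described lends itself to a compact illustration. Below is a minimal sketch assuming Miner's-rule damage accumulation and a lognormal impact-force model; the traffic level, strength, and S-N curve are invented placeholders, whereas the paper uses measured wheel/rail impact records and sleeper test data.

```python
import numpy as np

rng = np.random.default_rng(1)
YEARS, IMPACTS_PER_YEAR = 100, 50_000     # hypothetical traffic level
STATIC_STRENGTH_KN = 400.0                # hypothetical bending capacity

def cycles_to_failure(load_ratio):
    # hypothetical S-N curve: cycles fall steeply as load nears capacity
    return 10.0 ** (12.0 * (1.0 - load_ratio))

damage, failed_year = 0.0, None
for year in range(1, YEARS + 1):
    loads = rng.lognormal(np.log(60.0), 0.4, IMPACTS_PER_YEAR)  # impact loads, kN
    ratios = np.clip(loads / STATIC_STRENGTH_KN, 0.05, 1.0)
    damage += float(np.sum(1.0 / cycles_to_failure(ratios)))    # Miner's rule
    if damage >= 1.0:                     # failure criterion: damage reaches 1
        failed_year = year
        break

print(f"cumulative damage {damage:.3f}, failed in year {failed_year}")
```

The heavy tail of the impact distribution dominates the outcome, which mirrors the paper's emphasis on impact forces rather than routine loading.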
Abstract:
Work time spread across the entire week, rather than the conventional five-day working week, has meant that workers are now less able to utilise longer stretches of recreation time, especially a full two-day break over a weekend. This paper explores the issues contributing to workers' acquisition of longer recreation time. It seeks to determine the effects of this acquisition on the quality of working and non-working time for the employee through a study of work-life balance in the construction industry. It finds that weekends are more important to achieving work-life balance than shorter days over a six-day week when working long hours. Further, 'personal time' is a key element in achieving satisfactory work-life balance for employees, and this type of 'time' is often forgone in trying to integrate the necessary and desired non-work activities in the shorter time available to workers.
Abstract:
Background: Heavy vehicle transportation continues to grow internationally, yet crash rates are high and the risk of injury and death extends to all road users. The work environment poses many challenges for the heavy vehicle driver; conditions such as scheduling and payment are proposed risk factors for crashes, yet their effects remain to be precisely quantified. Other risk factors, such as sleep disorders including obstructive sleep apnoea, have been shown to increase crash risk in motor vehicle drivers; however, the crash risk that this and related health conditions pose to heavy vehicle drivers needs detailed investigation.

Methods and Design: The proposed case-control study will recruit 1034 long-distance heavy vehicle drivers: 517 who have crashed and 517 who have not. All participants will be interviewed at length regarding their driving and crash history, typical workloads, scheduling and payment, trip history over several days, sleep patterns, health, and substance use. All participants will be fitted with a nasal flow monitor for the detection of obstructive sleep apnoea.

Discussion: Significant attention has been paid to the enforcement of legislation aiming to deter problems such as excess loading, speeding, and substance use; however, there is inconclusive evidence as to the direction and strength of association of many other postulated risk factors for heavy vehicle crashes. The influence of factors such as remuneration and scheduling on crash risk is unclear, as is the association between sleep apnoea and heavy vehicle crash risk. Contributory factors such as sleep quality and quantity, body mass, and health status will be investigated. Quantifying the effect of these factors on heavy vehicle drivers will inform policy development aimed at safer driving practices and a reduction in heavy vehicle crashes, protecting the lives of many on the road network.
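As a small illustration of the effect measure such a case-control design yields, the sketch below computes an odds ratio and its 95% confidence interval for a hypothetical exposure (for example, screening positive for obstructive sleep apnoea) among 517 cases and 517 controls; all cell counts are invented for the example.

```python
from math import exp, log, sqrt

# invented 2x2 table: exposure among 517 crash cases and 517 controls
a, b = 180, 337    # cases: exposed, unexposed
c, d = 110, 407    # controls: exposed, unexposed

or_ = (a * d) / (b * c)                     # odds ratio
se = sqrt(1/a + 1/b + 1/c + 1/d)            # SE of log(OR) (Woolf method)
lo, hi = exp(log(or_) - 1.96 * se), exp(log(or_) + 1.96 * se)
print(f"OR = {or_:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```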
Abstract:
The present rate of technological advance continues to place significant demands on data storage devices. The sheer amount of digital data generated each year, along with consumer expectations, fuels these demands. At present, most digital data is stored magnetically, in the form of hard disk drives or magnetic tape. The increase in areal density (AD) of magnetic hard disk drives over the past 50 years has been of the order of 100 million times, and current devices store data at ADs of the order of hundreds of gigabits per square inch. However, it has been known for some time that progress in this form of data storage is approaching fundamental limits. The main limitation relates to the lower size limit that an individual bit can have for stable storage. Various techniques for overcoming these fundamental limits are currently the focus of considerable research effort. Most attempt to improve current data storage methods, or modify them slightly for higher-density storage.

Alternatively, three-dimensional optical data storage is a promising field for the information storage needs of the future, offering very high-density, high-speed memory. There are two ways in which data may be recorded in a three-dimensional optical medium: either bit by bit (similar in principle to an optical disc medium such as CD or DVD) or using pages of bit data. Bit-by-bit techniques for three-dimensional storage offer high density but are inherently slow due to the serial nature of data access. Page-based techniques, where a two-dimensional page of data bits is written in one write operation, can offer significantly higher data rates due to their parallel nature. Holographic Data Storage (HDS) is one such page-oriented optical memory technique. This field of research has been active for several decades, but few commercial products are presently available. Another page-oriented optical memory technique involves recording pages of data as phase masks in a photorefractive medium. A photorefractive material is one in which the refractive index can be modified by light of the appropriate wavelength and intensity, and this property can be used to store information in these materials. In phase mask storage, two-dimensional pages of data are recorded into a photorefractive crystal as refractive index changes in the medium. A low-intensity readout beam propagating through the medium has its intensity profile modified by these refractive index changes, and a CCD camera can be used to monitor the readout beam and thus read the stored data.

The main aim of this research was to investigate data storage using phase masks in the photorefractive crystal lithium niobate (LiNbO3). First, the experimental methods for storing the two-dimensional pages of data (a set of vertical stripes of varying lengths) in the medium are presented. The laser beam used for writing, whose intensity profile is modified by an amplitude mask containing a pattern of the information to be stored, illuminates the lithium niobate crystal, and the photorefractive effect causes the patterns to be stored as refractive index changes in the medium. These patterns are read out non-destructively using a low-intensity probe beam and a CCD camera. A common complication of information storage in photorefractive crystals is destructive readout. This is a problem particularly for holographic data storage, where the readout beam should be at the same wavelength as the beam used for writing.
Since the charge carriers in the medium are still sensitive to the readout light field, the readout beam erases the stored information. One method to avoid this is thermal fixing, in which the photorefractive medium is heated to temperatures above 150 °C; this process forms an ionic grating in the medium. The ionic grating is insensitive to the readout beam, so the information is not erased during readout.

A non-contact method for determining temperature change in a lithium niobate crystal is presented in this thesis. The temperature-dependent birefringent properties of the medium cause intensity oscillations to be observed for a beam propagating through the medium during a change in temperature. It is shown that each oscillation corresponds to a particular temperature change, and by counting the number of oscillations observed, the temperature change of the medium can be deduced. The presented technique for measuring temperature change could easily be applied to situations where thermal fixing of data in a photorefractive medium is required. Furthermore, by using an expanded beam and monitoring the intensity oscillations over a wide region, it is shown that the temperature in various locations of the crystal can be monitored simultaneously. This technique could be used to deduce temperature gradients in the medium.

It is shown that the three-dimensional nature of the recording medium causes interesting degradation effects to occur when the patterns are written for a longer-than-optimal time. This degradation results in the splitting of the vertical stripes in the data pattern, and for long writing exposure times this process can result in the complete deterioration of the information in the medium. It is shown that, simply by using incoherent illumination, the original pattern can be recovered from the degraded state. The recovery is possible because the refractive index changes causing the degradation are of a smaller magnitude, since they are induced by the write field components scattered from the written structures. During incoherent erasure, the lower-magnitude refractive index changes are neutralised first, allowing the original pattern to be recovered. The degradation process is shown to be reversed during the recovery process, and a simple relationship is found relating the times at which particular features appear during degradation and recovery. A further outcome of this work is that a minimum stripe width of 30 µm is required for accurate storage and recovery of the information in the medium; any smaller size results in incomplete recovery. The degradation and recovery process could be applied to image scrambling or cryptography for optical information storage.

A two-dimensional numerical model based on the finite-difference beam propagation method (FD-BPM) is presented and used to gain insight into the pattern storage process. The model shows that the degradation of the patterns is due to the complicated path taken by the write beam as it propagates through the crystal, and in particular the scattering of this beam from the induced refractive index structures in the medium. The model indicates that the highest-quality pattern storage would be achieved with a thin 0.5 mm medium; however, this type of medium would also remove the degradation property of the patterns and the subsequent recovery process.
To overcome the simplistic treatment of the refractive index change in the FD-BPM model, a fully three-dimensional photorefractive model developed by Devaux is presented. This model provides significant insight into the pattern storage, particularly the degradation and recovery process, and confirms that recovery of the degraded patterns is possible because the refractive index changes responsible for the degradation are of a smaller magnitude. Finally, detailed analysis of the pattern formation and degradation dynamics for periodic patterns of various periodicities is presented. It is shown that stripe widths in the write beam greater than 150 µm result in the formation of different types of refractive index changes, compared with stripes of smaller widths. As a result, the pattern storage method discussed in this thesis has an upper feature-size limit of 150 µm for accurate and reliable pattern storage.
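As an illustration of the propagation modelling described, below is a minimal Crank-Nicolson finite-difference BPM sketch, simplified to one transverse dimension; the wavelength, grid, stripe pattern, and index perturbation are illustrative assumptions, and the thesis's full photorefractive charge-transport dynamics are not reproduced.

```python
import numpy as np
from scipy.linalg import solve_banded

wl, n0 = 633e-9, 2.2                    # probe wavelength (m), ~LiNbO3 index
k0 = 2 * np.pi / wl
k = n0 * k0
nx, nz, dx, dz = 400, 500, 0.5e-6, 2e-6
x = (np.arange(nx) - nx / 2) * dx

# hypothetical stored stripe pattern as a small refractive-index change
dn = -1e-4 * (np.cos(2 * np.pi * x / 60e-6) > 0)

# paraxial equation: dE/dz = (i/2k) d2E/dx2 + i k0 dn(x) E
off = 1j / (2 * k * dx**2)
diag = -2 * off + 1j * k0 * dn
alpha, beta = 0.5 * dz * off, 0.5 * dz * diag

ab = np.zeros((3, nx), dtype=complex)   # banded form of (I - dz/2 * A)
ab[0, 1:], ab[1, :], ab[2, :-1] = -alpha, 1 - beta, -alpha

E = np.exp(-(x / 80e-6) ** 2).astype(complex)   # wide Gaussian readout beam
for _ in range(nz):
    rhs = (1 + beta) * E                # (I + dz/2 * A) E
    rhs[1:] += alpha * E[:-1]
    rhs[:-1] += alpha * E[1:]
    E = solve_banded((1, 1), ab, rhs)   # Crank-Nicolson step

print("transmitted intensity contrast:",
      float(np.abs(E).max() ** 2 - np.abs(E).min() ** 2))
```

Feeding the propagated field back into the index change at each step, as the thesis's models do, is what produces the degradation dynamics; this sketch propagates through a fixed index pattern only.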
Abstract:
Objective: Because studies of crowding in long-term care settings are lacking, the authors sought to: (1) generate initial estimates of crowding in nursing homes and assisted living facilities; and (2) evaluate two operational approaches to its measurement.

Background: Reactions to density and proximity are complex. Greater density intensifies people's reaction to a situation in the direction (positive or negative) in which they would react if the situation occurred under less dense conditions. People with dementia are especially reactive to the environment.

Methods: Using a cross-sectional correlational design in nursing homes and assisted living facilities involving 185 participants, multiple observations (N = 6,455) of crowding and other environmental variables were made. Crowding, location, and sound were measured three times per observation; ambiance was measured once. Data analyses consisted of descriptive statistics, t-tests, and one-way analysis of variance.

Results: Crowding estimates were higher in nursing homes and in dining and activity rooms. Crowding also varied across settings and locations by time of day. Overall, the interaction of location and time affected crowding significantly (N = 5,559, df [47, 511], F = 105.69, p < .0001); effects were greater within location-by-hour than between location-by-hour, but the interaction explained slightly less variance in Long-Term Care Crowding Index (LTC-CI) estimates (47.41%) than location alone. Crowding had small, direct, and highly significant correlations with sound and with the engaging subscale for ambiance; a similar, though inverse, correlation was seen with the soothing subscale for ambiance.

Conclusions: Crowding fluctuates in step with routine activities, such as meals, in long-term care settings. Furthermore, a relationship between crowding and other physical characteristics of the environment was found. The LTC-CI is likely to be more sensitive than simple people counts when seeking to evaluate the effects of crowding on the behavior of elders, particularly those with dementia, in long-term care settings.
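As a minimal illustration of the one-way analysis-of-variance step reported above, the sketch below compares invented crowding scores across three locations; the study's LTC-CI data are not reproduced here.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(7)
# hypothetical crowding-index observations grouped by location
dining = rng.normal(3.2, 0.8, 120)
activity = rng.normal(2.9, 0.8, 120)
hallway = rng.normal(1.6, 0.8, 120)

F, p = f_oneway(dining, activity, hallway)   # one-way ANOVA across locations
print(f"F = {F:.1f}, p = {p:.2g}")
```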
Abstract:
The paper provides an assessment of the performance of commercial Real Time Kinematic (RTK) systems over longer than recommended inter-station distances. The experiments were set up to test and analyse solutions from the i-MAX, MAX and VRS systems operated with three triangle-shaped network cells, having average inter-station distances of 69 km, 118 km and 166 km. The performance characteristics appraised included initialization success rate, initialization time, RTK position accuracy and availability, ambiguity resolution risk and RTK integrity risk, in order to provide a wider perspective on the performance of the systems tested.

The results showed that the performances of all network RTK solutions assessed were affected to similar degrees by the increase in inter-station distance. The MAX solution achieved the highest initialization success rate, 96.6% on average, albeit with a longer initialization time. The two VRS approaches achieved a lower initialization success rate of 80% over the large triangle. In terms of RTK positioning accuracy after successful initialization, the results indicated good agreement between the actual error growth in both horizontal and vertical components and the accuracy specified by the manufacturers' RMS and parts-per-million (ppm) values.

Additionally, the VRS approaches performed better than MAX and i-MAX when tested on the standard triangle network with a mean inter-station distance of 69 km. However, as the inter-station distance increases, the network RTK software may fail to generate VRS corrections and revert to operating in the nearest single-base RTK (or RAW) mode. The position uncertainty occasionally exceeded 2 meters, showing that the RTK rover software was using an incorrectly fixed ambiguity solution to estimate the rover position rather than automatically dropping back to an ambiguity-float solution. The results identified that the risk of incorrectly resolving ambiguities reached 18%, 20%, 13% and 25% for i-MAX, MAX, Leica VRS and Trimble VRS respectively when operating over the large triangle network. Additionally, the Coordinate Quality indicator values given by the Leica GX1230 GG rover receiver tended to be over-optimistic and did not function well in identifying incorrectly fixed integer ambiguity solutions. In summary, this independent assessment has identified some problems and failures that can occur in all of the systems tested, especially when they are pushed beyond their recommended limits. While such failures are expected, they offer useful insights into where users should be wary and how manufacturers might improve their products. The results also demonstrate that integrity monitoring of RTK solutions is necessary for precision applications, and thus deserves serious attention from researchers and system providers.
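A short worked example of the RMS + ppm accuracy specification mentioned above, using an illustrative 10 mm + 1 ppm figure rather than any particular manufacturer's specification:

```python
SPEC_MM, SPEC_PPM = 10.0, 1.0     # illustrative 10 mm + 1 ppm spec, not a real product's

for dist_km in (69, 118, 166):    # the three mean inter-station distances tested
    ppm_term_mm = SPEC_PPM * dist_km          # 1 ppm of d kilometres is d millimetres
    print(f"{dist_km} km: expected RMS about {SPEC_MM + ppm_term_mm:.0f} mm")
```

Under this illustrative specification, the expected error grows from about 79 mm at 69 km to about 176 mm at 166 km, which shows why error growth with distance dominates long-baseline RTK performance.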
Abstract:
Real-time kinematic (RTK) GPS techniques have been extensively developed for applications including surveying, structural monitoring, and machine automation. Limitations of the existing RTK techniques that hinder their application for geodynamics purposes are twofold: (1) the achievable RTK accuracy is at the level of a few centimeters, and the uncertainty of the vertical component is 1.5 to 2 times worse than that of the horizontal components; and (2) the RTK position uncertainty grows in proportion to the base-to-rover distance. The key limiting factor behind these problems is the significant effect of residual tropospheric errors on the positioning solutions, especially on the highly correlated height component. This paper develops a geometry-specified troposphere decorrelation strategy to achieve subcentimeter kinematic positioning accuracy in all three components. The key is to set up a relative zenith tropospheric delay (RZTD) parameter to absorb the residual tropospheric effects and to solve the resulting model as an ill-posed problem using the regularization method. In order to compute a reasonable regularization parameter and obtain an optimal regularized solution, the covariance matrix of the positional parameters estimated without the RZTD parameter, which is characterized by the observation geometry, is used to replace the quadratic matrix of their “true” values. As a result, the regularization parameter is computed adaptively as the observation geometry varies. The experimental results show that the new method can efficiently alleviate the model’s ill-conditioning and stabilize the solution from a single data epoch. Compared to the results from the conventional least squares method, the new method improves the long-range RTK solution precision from several centimeters to the subcentimeter level in all components. More significantly, the precision of the height component is even higher. Several geoscience applications that require subcentimeter real-time solutions can benefit greatly from the proposed approach, such as real-time monitoring of earthquakes and large dams, high-precision GPS leveling, and refinement of the vertical datum. In addition, the high-resolution RZTD solutions can contribute to effective recovery of tropospheric slant path delays in order to establish 4-D troposphere tomography.
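The regularization step can be illustrated compactly. Below is a minimal Tikhonov-regularized single-epoch solution sketch with the RZTD appended as a fourth parameter; the design matrix, weights, mapping function, and the fixed regularization parameter are all placeholders, whereas the paper derives the parameter adaptively from the geometry-only covariance matrix.

```python
import numpy as np

def regularized_epoch_solution(A, P, l, alpha):
    # Tikhonov-regularized least squares: (A'PA + alpha*I) x = A'P l
    N = A.T @ P @ A                       # normal matrix, ill-conditioned here
    u = A.T @ P @ l
    return np.linalg.solve(N + alpha * np.eye(N.shape[1]), u)

# toy epoch: 8 double-difference observations, parameters [dx, dy, dz, RZTD]
rng = np.random.default_rng(0)
A = rng.normal(size=(8, 4))
A[:, 3] = 1.0 / np.sin(np.radians(rng.uniform(15, 80, 8)))  # crude mapping function
P = np.eye(8)                             # placeholder weight matrix
l = A @ np.array([0.01, -0.02, 0.005, 0.03]) + rng.normal(0, 0.003, 8)

print(regularized_epoch_solution(A, P, l, alpha=0.1))
```

The regularizer damps the near-singular direction created by the strong correlation between the height and RZTD columns, which is the essence of the decorrelation strategy.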
Abstract:
In a randomized, double-blind study, 202 healthy adults received a live, attenuated Japanese encephalitis chimeric virus vaccine (JE-CV) and placebo 28 days apart in a cross-over design. A subgroup of 98 volunteers received a JE-CV booster at month 6. Safety, immunogenicity, and persistence of antibodies to month 60 were evaluated. There were no unexpected adverse events (AEs), and the incidence of AEs was similar between JE-CV and placebo. There were three serious adverse events (SAEs) and no deaths. A moderately severe case of acute viral illness commencing 39 days after placebo administration was the only SAE considered possibly related to immunization. Of the vaccine recipients, 99% achieved a seroprotective antibody titer ≥ 10 to JE-CV 28 days following the single dose of JE-CV, and 97% were seroprotected at month 6. Kaplan-Meier analysis showed that after a single dose of JE-CV, 87% of the participants who were seroprotected at month 6 were still protected at month 60. This rate was 96% among those who received a booster immunization at month 6. On day 28 after immunization, 95% of subjects developed a neutralizing titer ≥ 10 against at least three of the four strains of a panel of wild-type Japanese encephalitis virus (JEV) strains. At month 60, that proportion was 65% for participants who received a single dose of JE-CV and 75% for the booster group. These results suggest that JE-CV is safe and well tolerated, and that a single dose provides long-lasting immunity to wild-type strains.
Abstract:
Software transactional memory has the potential to greatly simplify development of concurrent software, by supporting safe composition of concurrent shared-state abstractions. However, STM semantics are defined in terms of low-level reads and writes on individual memory locations, so implementations are unable to take advantage of the properties of user-defined abstractions. Consequently, the performance of transactions over some structures can be disappointing.

We present Modular Transactional Memory, our framework which allows programmers to extend STM with concurrency control algorithms tailored to the data structures they use in concurrent programs. We describe our implementation in Concurrent Haskell, and two example structures: a finite map which allows concurrent transactions to operate on disjoint sets of keys, and a non-deterministic channel which supports concurrent sources and sinks.

Our approach is based on previous work by others on boosted and open-nested transactions, with one significant development: transactions are given types which denote the concurrency control algorithms they employ. Typed transactions offer a higher level of assurance for programmers reusing transactional code, and allow more flexible abstract concurrency control.
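The disjoint-key idea behind the finite map can be sketched in a language-agnostic way. The paper's implementation is typed STM in Concurrent Haskell; the Python sketch below only illustrates the underlying concurrency-control insight, that transactions touching disjoint key sets need not conflict, using per-key locks acquired in a deadlock-avoiding order.

```python
import threading
from collections import defaultdict

class KeyedMap:
    """Map whose 'transactions' lock only the keys they declare, so
    operations on disjoint key sets run in parallel (boosting-style)."""

    def __init__(self):
        self._data = {}
        self._locks = defaultdict(threading.Lock)  # lock-creation races ignored here

    def transact(self, keys, fn):
        # acquire key locks in sorted order to avoid deadlock between transactions
        for k in sorted(keys):
            self._locks[k].acquire()
        try:
            return fn(self._data)       # fn should touch only the declared keys
        finally:
            for k in sorted(keys, reverse=True):
                self._locks[k].release()

m = KeyedMap()
m.transact({"a"}, lambda d: d.update(a=1))
m.transact({"b"}, lambda d: d.update(b=2))   # disjoint keys: no contention
print(m._data)
```

The paper's typed transactions go further: the type of a transaction records which concurrency-control algorithm it uses, so incompatible transactions cannot be composed accidentally.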
Abstract:
Objective: Alcohol-related implicit (preconscious) cognitive processes are established and unique predictors of alcohol use, but most research in this area has focused on alcohol-related implicit cognition and anxiety. This study extends this work into the area of depressed mood by testing a cognitive model that combines traditional explicit (conscious and considered) beliefs, implicit alcohol-related memory associations (AMAs), and self-reported drinking behavior.

Method: Using a sample of 106 university students, depressed mood was manipulated using a musical mood induction procedure immediately prior to completion of implicit then explicit alcohol-related cognition measures. A bootstrapped two-group (weak/strong expectancies of negative affect and tension reduction) structural equation model was used to examine how mood changes and alcohol-related memory associations varied across groups.

Results: Expectancies of negative affect moderated the association of depressed mood and AMAs, but there was no such association for tension reduction expectancy.

Conclusion: Subtle mood changes may unconsciously trigger alcohol-related memories in vulnerable individuals. Results have implications for addressing subtle fluctuations in depressed mood among young adults at risk of alcohol problems.
Abstract:
This full-day workshop invites participants to consider the nexus where the interests of game design, the expectations of play, and HCI meet: the game interface. Game interfaces seem different from the interfaces of other software, and a number of observations have been made about this. Shneiderman famously noticed that while most software designers are intent on following the tenets of the “invisible computer” and making access easy for the user, game interfaces are made for players: they embed challenge. Schell discusses a “strange” relationship between the player and the game enabled by the interface, and user interface designers frequently opine that much can be learned from the design of game interfaces. So where does the game interface actually sit? Even more interesting is the question of whether the history of this relationship and the subsequent expectations are now limiting the potential of game design as an expressive form. Recent innovations in I/O design, such as Nintendo’s Wii, Sony’s Move and Microsoft's Kinect, seem to usher in an age of physical player-enabled interaction, experience and embodied, engaged design. This workshop intends to cast light on this often mentioned but only sporadically examined area, and to establish a platform for new and innovative design in the field.
Abstract:
In the past two decades there has been increasing interest in branding tourism destinations in an effort to differentiate them meaningfully from the myriad competing places that offer similar attractions and facilities. The academic literature on destination branding commenced only as recently as 1998, and there remains a dearth of empirical data testing the effectiveness of brand campaigns, particularly in terms of enhancing destination loyalty. This paper reports the results of an investigation into destination brand loyalty for Australia as a long-haul destination in a South American market. In spite of the high level of academic interest in measuring perceptions of destinations since the 1970s, few previous studies have examined perceptions held by South American consumers. Drawing on a model of consumer-based brand equity (CBBE), antecedents of destination brand loyalty were tested with data from a large Chilean sample of travelers, comprising a mix of previous visitors and non-visitors to Australia. Findings suggest that destination brand awareness, brand image, and brand value are positively related to brand loyalty for a long-haul destination; however, destination brand quality was not significantly related. The results also indicate that Australia is a more compelling destination brand for previous visitors than for non-visitors.
Abstract:
This paper investigates virtual reality representations of performance in London’s late sixteenth-century Rose Theatre, a venue that, by means of current technology, can once again challenge perceptions of space, performance, and memory. The VR model of The Rose becomes a Camillo device in that it represents a virtual recreation of this venue in as much detail as possible and attempts to recover graphic demonstrations of the trace memories of the performance modes of the day. The VR model is based on accurate archeological and theatre historical records and is easy to navigate. The introduction of human figures onto The Rose’s stage via motion capture allows us to explore the relationships between space, actor and environment. The combination of venue and actors facilitates a new way of thinking about how the work of early modern playwrights can be stored and recalled. This virtual theatre is thus activated to intersect productively with contemporary studies in performance; as such, our paper provides a perspective on and embodiment of the relation between technology, memory and experience. It is, at its simplest, a useful archiving project for theatrical history, but it is directly relevant to contemporary performance practice as well. Further, it reflects upon how technology and ‘re-enactments’ of sorts mediate the way in which knowledge and experience are transferred, and even what may be considered ‘knowledge.’ Our work provides opportunities to begin addressing what such intermedial confrontations might produce for ‘remembering, experiencing, thinking and imagining.’ We contend that these confrontations will enhance live theatre performance rather than impeding or disrupting contemporary performance practice. This paper intersects with the CFP’s ‘Performing Memory’ and ‘Memory Lab’ themes. Our presentation (which includes a demonstration of the VR model and the motion capture it requires) takes the form of two closely linked papers that share a single abstract. The two papers will be given by two people, one of whom will be physically present in Utrecht, the other participating via Skype.