Abstract:
The GameFlow model strives to be a general model of player enjoyment, applicable to all game genres and platforms. Derived from a general set of heuristics for creating enjoyable player experiences, the GameFlow model has been widely used in evaluating many types of games, as well as non-game applications. However, we recognize that more specific, low-level, and implementable criteria are potentially more useful for designing and evaluating video games. Consequently, the research reported in this paper aims to provide detailed heuristics for designing and evaluating one specific game genre, real-time strategy games. In order to develop these heuristics, we conducted a grounded theoretical analysis on a set of professional game reviews and structured the resulting heuristics using the GameFlow model. The resulting 165 heuristics for designing and evaluating real-time strategy games are presented and discussed in this paper.
Abstract:
Internet services are an important part of most people's daily activities. These services come with sophisticated authentication requirements which may not be handled well by average Internet users. The management of secure passwords, for example, creates extra overhead which is often neglected for usability reasons. Furthermore, password-based approaches apply only to initial logins and do not protect against unlocked workstation attacks. In this paper, we provide a non-intrusive identity verification scheme based on behavioral biometrics, where keystroke dynamics based on free text is used continuously to verify the identity of a user in real time. We improve existing keystroke dynamics based verification schemes in four aspects. First, we improve scalability by using a constant number of users, instead of the whole user space, to verify the identity of the target user. Second, we provide an adaptive user model which enables our solution to take changes in user behavior into account in the verification decision. Third, we identify a new distance measure which enables us to verify the identity of a user with shorter text. Fourth, we decrease the number of false results. Our solution is evaluated on a data set which we collected from users while they were interacting with their mailboxes during their daily activities.
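The digraph-timing idea behind free-text keystroke verification can be sketched as a toy example. Everything here is invented for illustration: the profile representation (mean digraph latencies), the mean-absolute-difference distance, and the acceptance threshold are not the paper's actual measures.

```python
# Toy sketch of free-text keystroke-dynamics verification. Profiles map
# digraphs (two-key sequences) to mean observed latencies in milliseconds.

def digraph_profile(timings):
    """Collapse observed digraph latencies (ms) to per-digraph means."""
    return {d: sum(ms) / len(ms) for d, ms in timings.items()}

def profile_distance(p, q):
    """Mean absolute latency difference over digraphs both profiles share."""
    shared = set(p) & set(q)
    if not shared:
        return float("inf")  # nothing to compare: treat as maximally distant
    return sum(abs(p[d] - q[d]) for d in shared) / len(shared)

def verify(enrolled, sample, threshold=25.0):
    """Accept the sample as the enrolled user if the profiles are close."""
    return profile_distance(enrolled, sample) <= threshold

enrolled = digraph_profile({"th": [95, 105], "he": [120, 130], "er": [140]})
genuine  = digraph_profile({"th": [100], "he": [128]})
impostor = digraph_profile({"th": [180], "he": [60]})
print(verify(enrolled, genuine))    # a close profile is accepted
print(verify(enrolled, impostor))   # a distant profile is rejected
```

A shorter text yields fewer shared digraphs, which is why the choice of distance measure matters for early verification decisions.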
Abstract:
The legal framework that operates at the end of life in Australia needs to be reformed.
• Voluntary euthanasia and assisted suicide are currently unlawful.
• Both activities nevertheless occur not infrequently in Australia, in part because palliative care cannot relieve physical and psychological pain and suffering in all cases.
• In this respect, the law is deficient. The law is also unfair because it does not treat people equally. Some people can be helped to die on their own terms as a result of their knowledge and/or connections, and some are able to hasten their death by refusing life-sustaining treatment, but others do not have access to the means for their life to end.
• A very substantial majority of Australians have repeatedly expressed, in public opinion polls, their desire for law reform on these matters. Many are concerned about what they see happening to their loved ones as they reach the end of their lives, and want the confidence that when their time comes they will be able to exercise choice in relation to assisted dying.
• The most consistent reason advanced against changing the law is the need to protect the vulnerable. There is a concern that if the law allows voluntary euthanasia and assisted suicide for some people, it will be expanded and abused, including pressure being placed on highly dependent people and those with disabilities to agree to euthanasia.
• But there is now a large body of experience in a number of international jurisdictions following the legalisation of voluntary euthanasia and/or assisted suicide. This shows that appropriate safeguards can be implemented to protect vulnerable people and prevent the abuse that opponents of assisted dying have feared. It reveals that assisted dying meets a real need among a small minority of people at the end of their lives. It also provides reassurance to people with terminal and incurable disease that they will not be left to suffer the indignities and discomfort of a nasty death.
• Australia is an increasingly secular society. Strong opposition to assisted death by religious groups, based on their belief in the divine sanctity of all human life, is not a justification for denying choice to those who do not share that belief.
• It is now time for Australian legislators to respond to this concern and this experience by legislating to enhance the quality of death for those Australians who seek assisted dying.
Abstract:
Purpose: The measurement of broadband ultrasonic attenuation (BUA) in cancellous bone for the assessment of osteoporosis follows a parabolic-type dependence with bone volume fraction, with minimum values corresponding to both entire bone and entire marrow. Langton has recently proposed that the primary BUA mechanism may be significant phase interference due to variations in propagation transit time through the test sample, as detected over the phase-sensitive surface of the receiving ultrasound transducer. This fundamentally simple concept assumes that the propagation of ultrasound through a complex solid:liquid composite sample such as cancellous bone may be modelled as an array of parallel 'sonic rays'. The transit time of each ray is defined by the proportion of bone and marrow propagated, being a minimum (tmin) solely through bone and a maximum (tmax) solely through marrow. A Transit Time Spectrum (TTS), ranging from tmin to tmax, may be defined, describing the proportion of sonic rays having a particular transit time, and effectively describing the lateral inhomogeneity of transit time over the surface of the receiving ultrasound transducer. Phase interference may result from the interaction of sonic rays of differing transit times. The aim of this study was to test the hypothesis that phase interference depends on the lateral inhomogeneity of transit time, by comparing experimental measurements and computer simulation predictions of ultrasound propagation through a range of relatively simple solid:liquid models exhibiting a range of lateral inhomogeneities. Methods: A range of test models was manufactured using acrylic and water as surrogates for bone and marrow respectively.
The models varied in thickness in one dimension normal to the direction of propagation, hence exhibiting a range of transit time lateral inhomogeneities, from minimal (a single transit time) to maximal (a wedge; ultimately the limiting case where each sonic ray has a unique transit time). For the experimental component of the study, two unfocused 1 MHz, ¾-inch diameter broadband transducers were utilized in transmission mode, and ultrasound signals were recorded for each of the models. The computer simulation was performed in Matlab, where the transit time and relative amplitude of each sonic ray were calculated. The transit time for each sonic ray was defined as the sum of the transit times through its acrylic and water components. The relative amplitude accounted for the reception area of each sonic ray along with absorption in the acrylic. To replicate phase-sensitive detection, all sonic rays were summed and the output signal plotted in comparison with the experimentally derived output signal. Results: Qualitative and quantitative comparison of the experimental and computer simulation results shows an extremely high degree of agreement, from 94.2% to 99.0%, between the two approaches, supporting the concept that the propagation of an ultrasound wave, for the models considered, may be approximated by a parallel sonic ray model in which the transit time of each ray is defined by the proportion of 'bone' and 'marrow' traversed. Conclusions: This combined experimental and computer simulation study has successfully demonstrated that lateral inhomogeneity of transit time creates significant potential for phase interference when a phase-sensitive receiving ultrasound transducer is used, as in most commercial ultrasound bone analysis devices.
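The phase-sensitive summation at the heart of the sonic-ray model can be illustrated with a small phasor-sum sketch. All parameters below (ray count, transit times, unit amplitudes) are invented for illustration and do not reproduce the paper's Matlab simulation.

```python
# Minimal phasor-sum sketch of the parallel sonic-ray model: each ray is
# a unit-amplitude 1 MHz component delayed by its transit time, and a
# phase-sensitive receiver sums all rays, so any spread in transit times
# (lateral inhomogeneity) causes destructive phase interference.
import math

def received_amplitude(transit_times, freq=1.0e6):
    """Normalised magnitude of the coherent (phasor) sum of all rays."""
    re = sum(math.cos(2 * math.pi * freq * t) for t in transit_times)
    im = sum(math.sin(2 * math.pi * freq * t) for t in transit_times)
    return math.hypot(re, im) / len(transit_times)

# Uniform slab: every ray shares one transit time, so no interference.
uniform = [10e-6] * 100
# Wedge: transit time varies linearly across the receiver face, the
# limiting case where each sonic ray has a unique transit time.
wedge = [10e-6 + i * 5e-9 for i in range(100)]

print(received_amplitude(uniform))  # close to 1.0: no phase cancellation
print(received_amplitude(wedge))    # noticeably below 1.0: interference
```

The drop in received amplitude for the wedge, relative to the uniform slab, is the phase-interference effect that the study attributes to lateral inhomogeneity of transit time.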
Abstract:
The statutory demand procedure has been a part of our corporate law since its earliest modern formulations, and it has been suggested, albeit anecdotally, that under the current regime it gives rise to more litigation than any other part of the Corporations Act. Despite this, there has been a lack of consideration of the underlying policy behind the procedure in both the case law and the literature, both of which are largely centred on the technical aspects of the process. The purpose of this article is to examine briefly the process of the statutory demand in the context of current insolvency law in Australia.
Abstract:
The application of different EMS current thresholds to a muscle activates not only the muscle but also peripheral sensory axons that send proprioceptive and pain signals to the cerebral cortex. A 32-channel time-domain fNIRS instrument was employed to map regional cortical activity under varied EMS current intensities applied to the right wrist extensor muscle. Eight healthy volunteers underwent four EMS sessions at different current thresholds based on their individual maximal tolerated intensity (MTI), i.e., 10 % < 50 % < 100 % < over 100 % MTI. Time courses of the absolute oxygenated and deoxygenated hemoglobin concentrations, primarily over the bilateral sensorimotor cortical (SMC) regions, were extracted, and cortical activation maps were determined by a general linear model using the NIRS-SPM software. The stimulation-induced wrist extension paradigm significantly increased activation of the contralateral SMC region according to the EMS intensity, while the ipsilateral SMC region showed no significant changes. This could be due in part to a nociceptive response to the higher EMS current intensities, and may also result from increased sensorimotor integration in these cortical regions.
Abstract:
The findings presented in this paper are part of a research project designed to provide a preliminary indication of the support needs of postdiagnosis women with breast cancer in remote and isolated areas of Queensland. This discussion presents data focusing on the women's expressed personal concerns. For participants in this research, a diagnosis of breast cancer involves a confrontation with their own mortality and the possibility of a reduced life span. This is a definite life crisis, creating shock and requiring considerable adjustment. Along with these generic issues, the participants also articulated significant issues relating to their experience as women in a rural setting. These concerns centred on worries about how their partners and families cope during their absences for treatment, the additional burden on the family of having to run the property or farm during the participant's absence or illness, the added financial strain brought about by the cost of travel for treatment, the maintenance of properties during absences, and the problems created by time away from properties or self-employment. These findings accord with other reports on health and welfare services for rural Australians and with the generic psycho-oncology literature on breast cancer.
Abstract:
This paper presents recent findings from a study on the postdiagnosis support needs of women with breast cancer living in rural and remote Queensland. The findings presented in this discussion focus on support needs from the perspective of the women experiencing breast cancer as well as that of health service providers. The tyranny of distance imposes unique hardships during a time of great vulnerability, such as separation from family and friends during treatment, the need to travel long distances for support and follow-up services, and extra financial burdens, which can combine to strain the marital relationship and family cohesion. A positive indication, however, is that rural communities operate strong, informal networks of support. This network of family, friends and community can, and does, play an active role in the provision of emotional and practical support.
Abstract:
A one-time program is a hypothetical device by which a user may evaluate a circuit on exactly one input of his choice, before the device self-destructs. One-time programs cannot be achieved by software alone, as any software can be copied and re-run. However, it is known that every circuit can be compiled into a one-time program using a very basic hypothetical hardware device called a one-time memory. At first glance it may seem that quantum information, which cannot be copied, might also allow for one-time programs. But it is not hard to see that this intuition is false: one-time programs for classical or quantum circuits based solely on quantum information do not exist, even with computational assumptions. This observation raises the question, "what assumptions are required to achieve one-time programs for quantum circuits?" Our main result is that any quantum circuit can be compiled into a one-time program assuming only the same basic one-time memory devices used for classical circuits. Moreover, these quantum one-time programs achieve statistical universal composability (UC-security) against any malicious user. Our construction employs methods for computation on authenticated quantum data, and we present a new quantum authentication scheme called the trap scheme for this purpose. As a corollary, we establish UC-security of a recent protocol for delegated quantum computation.
Abstract:
This paper presents the results from a study of information behaviours, with a specific focus on information organisation-related behaviours, conducted as part of a larger daily diary study with 34 participants. The findings indicate that the organisation of information in everyday life is a problematic area due to various factors. The self-evident one is the inter-subjectivity between the person who may have organised the information and the person looking for that same information (Berlin et al., 1993). Increasingly, though, we are not just looking for information within collections that have been designed by someone else, but within our own personal collections of information, which frequently include books, electronic files, photos, records, documents, desktops, web bookmarks, and portable devices. The passage of time between when we categorised or classified the information and when we look for that same information poses several problems of intra-subjectivity, or the difference between our own past and present perceptions of the same information. Information searching, and hence the retrieval of information from one's own collection in everyday life, involves a spatial and temporal coordination with one's own past selves in a sort of cognitive and affective time travel, just as organising information is a form of anticipatory coordination with one's future information needs. This has implications for finding information and also for personal information management.
Abstract:
Exposure to ultrafine particles (UFPs) is deemed to be a major risk to human health. Therefore, airborne particle studies have been performed in recent years to evaluate the most critical micro-environments and to identify the main UFP sources. Nonetheless, in order to properly evaluate UFP exposure, personal monitoring is required as the only way to relate particle exposure levels to the activities performed and the micro-environments visited. To this purpose, the present work reports the results of an experimental analysis aimed at showing the effect of time-activity patterns on UFP personal exposure. In particular, 24 non-smoking couples (12 during winter and 12 during summer), each comprising a man who worked full-time and a woman who was a homemaker, were analyzed using personal particle counters and GPS monitors. Each couple was investigated for a 48-h period, during which they also filled out a diary reporting the daily activities performed. Time-activity patterns, particle number concentration exposure, and the related dose received by the participants, in terms of particle alveolar-deposited surface area, were measured. The average exposure to particle number concentration was higher for women during both summer and winter (summer: women 1.8×10⁴ part. cm⁻³, men 9.2×10³ part. cm⁻³; winter: women 2.9×10⁴ part. cm⁻³, men 1.3×10⁴ part. cm⁻³), which was likely due to the time spent on cooking activities. Staying indoors after cooking also led to a higher alveolar-deposited surface area dose for both women and men during the winter (9.12×10² and 6.33×10² mm², respectively), when indoor ventilation was greatly reduced. The effect of cooking activities was also detected in terms of the women's dose intensity (dose per unit time), which was 8.6 in winter and 6.6 in summer. By contrast, the highest dose intensity activity for men was time spent using transportation (2.8 in both winter and summer).
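The two exposure metrics used in the study, a time-weighted average of the measured concentration and dose intensity (dose per unit time), can be illustrated with a toy calculation. The activity breakdown, concentrations, and hours below are invented for illustration, not the study's data.

```python
# Toy illustration of the exposure metrics: a time-weighted average of
# particle number concentration across a day's activities, and dose
# intensity, i.e. cumulative dose divided by the time spent.

def time_weighted_exposure(activities):
    """Average concentration (part. cm^-3) weighted by hours spent."""
    total_hours = sum(hours for _, hours in activities)
    return sum(conc * hours for conc, hours in activities) / total_hours

def dose_intensity(dose, hours):
    """Dose (e.g. alveolar-deposited surface area, mm^2) per unit time."""
    return dose / hours

# Hypothetical 24-h day: (concentration in part. cm^-3, hours spent).
day = [
    (2.5e4, 1.0),   # cooking: short but high concentration
    (1.2e4, 8.0),   # indoors after cooking, ventilation reduced
    (8.0e3, 15.0),  # remainder of the day
]
print(time_weighted_exposure(day))  # dominated by the long, lower-exposure periods
print(dose_intensity(912.0, 6.0))   # e.g. a 9.12x10^2 mm^2 dose over 6 h
```

The same structure explains the study's headline result: a short, high-concentration activity such as cooking can dominate dose intensity even when it barely moves the daily average.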
Abstract:
“Mental illness is a tough illness to survive, it is incurable but manageable. Living with the illness when at its full potency can disrupt your life at any moment.” Intensive care for patients experiencing acute psychiatric distress is an essential yet complex part of mental health services as a whole system. Psychiatric intensive care units remain a source of controversy, despite promising developments in health services incorporating the recovery goals and processes outlined by people with a mental illness themselves. In past decades, changes in the provision of mental health services have focused on the restoration of a meaningful and empowered life, with choice and hope as defining attributes of recovery. Yet what does recovery mean, and how are recovery principles accomplished in psychiatric intensive care arrangements for someone experiencing acute psychiatric distress?
Abstract:
This paper discusses findings from a study of energy use feedback in the home (eco-feedback) well after the novelty has worn off. Addressing four important knowledge gaps in the research, we explore eco-feedback over longer time scales, focusing on instances where the feedback was not of lasting benefit to users rather than on those where it was. Drawing on 23 semi-structured interviews with Australian householders, we found that an initially high level of engagement gave way over time to disinterest, neglect and, in certain cases, technical malfunction. Additionally, preconceptions about the “purpose” of the feedback were found to affect use. We propose expanding the scope of enquiry for eco-feedback in several ways, and describe how eco-feedback that better supports decision-making in the “maintenance phase”, i.e. once the initial novelty has worn off, may be key to longer-term engagement.
Abstract:
Computer-generated materials are ubiquitous and we encounter them on a daily basis, even though most people are unaware that this is the case. Blockbuster movies, television weather reports and telephone directories all include material that is produced by utilising computer technologies. Copyright protection for materials generated by a programmed computer was considered by the Federal Court and the Full Court of the Federal Court in Telstra Corporation Limited v Phone Directories Company Pty Ltd. The court held that the White and Yellow Pages telephone directories produced by Telstra and its subsidiary, Sensis, were not protected by copyright because they were computer-generated works lacking the requisite human authorship. The Copyright Act 1968 (Cth) does not contain specific provisions on the subsistence of copyright in computer-generated materials. Although the issue of copyright protection for computer-generated materials has been examined in Australia on two separate occasions by independently constituted Copyright Law Review Committees over a period of 10 years (1988 to 1998), the Committees' recommendations for legislative clarification by the enactment of specific amendments to the Copyright Act have not yet been implemented, and the legal position remains unclear. In the light of the decision of the Full Federal Court in Telstra v Phone Directories, it is timely to consider whether specific provisions should be enacted to clarify the position of computer-generated works under copyright law and, in particular, whether the requirement of human authorship for original works protected under Part III of the Copyright Act should now be reconceptualised to align with the realities of how copyright materials are created in the digital era.
Abstract:
Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a 'noisy' environment such as contemporary social media, is to collect the pertinent information, whether that is information for a specific study, tweets that can inform emergency services or other responders during an ongoing crisis, or information that gives an advantage to those involved in prediction markets. Often such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the datasets collected and analyzed are pre-formed, that is, built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting those that need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions for this: content analysis and user profiling. In the former, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as authoritative sources.
Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, while knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sports men and women create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services remain niche operations, much of the value of the information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis. A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in a way, keyword creation is part strategy and part art.
In this paper we describe strategies for the creation of a social media archive, specifically of tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement's presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme among these papers is that of constructing a dataset, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and the subsequent discussion, the panel will inform the wider research community not only about the objectives and limitations of data collection, live analytics, and filtering, but also about current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
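The content-analysis scoring described for the first paper could, in principle, look something like the following sketch. The term lists, weights, and threshold here are invented for illustration; they are not the panel's actual coding scheme.

```python
# Hypothetical sketch of content-analysis coding for crisis tweets:
# score each tweet for topical relevance and urgency, then pass only
# high-scoring tweets to responders, highest-scoring first.

TOPIC_TERMS = {"flood": 2, "evacuate": 3, "roadclosed": 2}    # relevance weights
URGENCY_TERMS = {"trapped": 5, "help": 4, "now": 1}           # urgency weights

def score_tweet(text):
    """Sum topic and urgency weights for the terms present in the tweet."""
    words = set(text.lower().split())
    topic = sum(w for term, w in TOPIC_TERMS.items() if term in words)
    urgency = sum(w for term, w in URGENCY_TERMS.items() if term in words)
    return topic + urgency

def filter_for_responders(tweets, threshold=5):
    """Keep tweets scoring at or above the threshold, highest first."""
    scored = sorted(((score_tweet(t), t) for t in tweets), reverse=True)
    return [t for s, t in scored if s >= threshold]

stream = [
    "Trapped by flood help now",
    "lovely sunny day",
    "Main st roadclosed evacuate",
]
print(filter_for_responders(stream))
```

In practice the term weights themselves would be iteratively refined as the event's keywords and hashtags change, which is the feedback loop the panel describes between filtering and future data collection.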