878 results for sets of words
Abstract:
We compared changes in markers of muscle damage and systemic inflammation after submaximal and maximal lengthening muscle contractions of the elbow flexors. Using a cross-over design, 10 healthy young men not involved in resistance training completed a submaximal trial (10 sets of 60 lengthening contractions at 10% maximum isometric strength, 1 min rest between sets), followed by a maximal trial (10 sets of three lengthening contractions at 100% maximum isometric strength, 3 min rest between sets). Lengthening contractions were performed on an isokinetic dynamometer. Opposite arms were used for the submaximal and maximal trials, and the trials were separated by a minimum of two weeks. Blood was sampled before, immediately after, 1 h, 3 h, and 1-4 days after each trial. Total leukocyte and neutrophil numbers, and the serum concentration of soluble tumor necrosis factor-alpha receptor 1, were elevated after both trials (P < 0.01), but there were no differences between the trials. Serum IL-6 concentration was elevated 3 h after the submaximal contractions (P < 0.01). The concentrations of serum tumor necrosis factor-alpha, IL-1 receptor antagonist, IL-10, granulocyte colony-stimulating factor and plasma C-reactive protein remained unchanged following both trials. Maximum isometric strength and range of motion decreased significantly (P < 0.001) after both trials, and were lower from 1-4 days after the maximal contractions compared to the submaximal contractions. Plasma myoglobin concentration and creatine kinase activity, muscle soreness and upper arm circumference all increased after both trials (P < 0.01), but were not significantly different between the trials. Therefore, there were no differences in markers of systemic inflammation, despite evidence of greater muscle damage following maximal versus submaximal lengthening contractions of the elbow flexors.
Abstract:
Most learning paradigms impose a particular syntax on the class of concepts to be learned; the chosen syntax can dramatically affect whether the class is learnable or not. For classification paradigms, where the task is to determine whether the underlying world does or does not have a particular property, how that property is represented has no bearing on the power of a classifier that just outputs 1's or 0's. But is it possible to give a canonical syntactic representation of the class of concepts that are classifiable according to the particular criteria of a given paradigm? We provide a positive answer to this question for classification in the limit paradigms in a logical setting, with ordinal mind change bounds as a measure of complexity. The syntactic characterization that emerges makes it possible to derive that if a possibly noncomputable classifier can perform the task assigned to it by the paradigm, then a computable classifier can perform the same task. The syntactic characterization is strongly related to the difference hierarchy over the class of open sets of some topological space; this space is naturally defined from the class of possible worlds and possible data of the learning paradigm.
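For context, the difference hierarchy referred to above is the standard Hausdorff-Kuratowski hierarchy built over the open sets of a topological space; the formulation below is the usual textbook one and is included only as background, since the abstract does not reproduce the paper's own notation.

% Hausdorff-Kuratowski difference operation over an increasing sequence of
% open sets (U_\eta)_{\eta<\theta} of a space X, for an ordinal \theta:
\[
D_\theta\bigl((U_\eta)_{\eta<\theta}\bigr)
  = \Bigl\{\, x \in \bigcup_{\eta<\theta} U_\eta \;:\;
      \text{the least } \eta \text{ with } x \in U_\eta
      \text{ has parity opposite to that of } \theta \,\Bigr\}.
\]
% For example, D_1(U_0) = U_0 is just an open set, and, for U_0 \subseteq U_1,
% D_2(U_0, U_1) = U_1 \setminus U_0 is a difference of two open sets.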
Abstract:
Many ageing road bridges, particularly timber bridges, require urgent improvement due to the demand imposed by the recent version of the Australian bridge loading code, AS 5100. As traffic volume plays a key role in the decision of budget allocations for bridge refurbishment/replacement, many bridges in low-volume traffic networks remain in poor condition with axle load and/or speed restrictions, thus disadvantaging many rural communities. This thesis examines an economical and environmentally sensible option of incorporating disused flat rail wagons (FRW) in the construction of bridges in low-volume, high-axle-load road networks. The constructability, economy and structural adequacy of the FRW road bridge are reported in the thesis, with particular focus on a demonstration bridge commissioned in regional Queensland. The demonstration bridge comprises a reinforced concrete slab (RCS) pavement resting on two FRWs with custom-designed connection brackets at regular intervals along the span of the bridge. The FRW-RC bridge deck assembly is supported on elastomeric rubber pads resting on the abutment. As this type of bridge replacement technology is new and its structural design is not covered in the design standards, the in-service structural performance of the FRW bridge subjected to the high axle loadings prescribed in AS 5100 is examined through performance load testing. Both the static and the moving load tests are carried out using a fully laden, commonly available three-axle tandem truck. The bridge deck is extensively strain-gauged, and displacement at several key locations is measured using linear variable displacement transducers (LVDTs). A high speed camera is used in the performance test and the digital image data are analysed using proprietary software to capture the wheel positions on the bridge span accurately. The wheel location is thus synchronised with the displacement and strain time series to infer the structural response of the FRW bridge. Field test data are used to calibrate a grillage model, developed for further analysis of the FRW bridge under the various sets of high axle loads stipulated in the bridge design standard. Bridge behaviour predicted by the grillage model shows that the live load stresses in the FRW bridge are significantly lower than the yield strength of the steel and that the deflections are well below the serviceability limit state set out in AS 5100. Based on the results reported in this thesis, it is concluded that disused FRWs can resist the high axle loading prescribed in AS 5100 and are a viable alternative bridge deck solution in the context of low-volume road networks.
Abstract:
This thesis explores the business environment for self-publishing musicians at the end of the 20th century and the start of the 21st century from theoretical and empirical standpoints. The exploration begins by asking three research questions: what are the factors affecting the sustainability of an Independent music business; how many of those factors can be directly influenced by an Independent musician in the day-to-day operations of their musical enterprise; and how can those factors be best manipulated to maximise the benefit generated from digital music assets? It answers these questions by considering the nature of value in the music business in light of theories of political economy, then quantitative and qualitative examinations of the nature of participation in the music business, and then auto-ethnographic approaches to the application of two technologically enabled tools available to Independent musicians. By analyzing the results of five different examinations of the topic it answers each research question with reference to four sets of recurring issues that affect the operations of a 21st century music business: the musicians’ personal characteristics, their ability to address their business’s informational needs; their ability to manage the relationships upon which their business depends; and their ability to resolve the remaining technological problems that confront them. It discusses ways in which Independent self-publishing musicians can and cannot deal with these four issues on a day-to-day basis and highlights aspects for which technological solutions do not exist as well as ways in which technology is not as effective as has been claimed. It then presents a self-critique and proposes some directions for further study before concluding by suggesting some common features of 21st century Independent music businesses. This thesis makes three contributions to knowledge. First, it provides a new understanding of the sources of musical value, shows how this explains changes in the music industries over the past 30 years, and provides a framework for predicting future developments in those industries. Second, it shows how the technological discontinuity that has occurred around the start of the 21st century has and has not affected the production and distribution of digital cultural artefacts and thus the attitudes, approaches, and business prospects of Independent musicians. Third, it argues for new understandings of two methods by which self-publishing musicians can grow a business using production methods that are only beginning to be more broadly understood: home studio recording and fan-sourced production. Developed from the perspective of working musicians themselves, this thesis identifies four sets of issues that determine the probable success of musicians’ efforts to adopt new technologies to capture the value of the musicians’ creativity and thereby foster growth that will sustain an Independent music business in the 21st century.
Abstract:
The Web has become a worldwide repository of information which individuals, companies, and organizations utilize to solve or address various information problems. Many of these Web users utilize automated agents to gather this information for them. Some assume that this approach represents a more sophisticated method of searching. However, there is little research investigating how Web agents search for online information. In this research, we first provide a classification of information agents based on stage of information gathering, gathering approach, and agent architecture. We then examine an implementation of one of the resulting classifications in detail, investigating how agents search for information on Web search engines, including sessions, queries, terms, and the duration and frequency of interactions. For this temporal study, we analyzed three data sets of queries and page views from agents interacting with the Excite and AltaVista search engines from 1997 to 2002, examining approximately 900,000 queries submitted by over 3,000 agents. Findings include: (1) agent sessions are extremely interactive, sometimes with hundreds of interactions per second; (2) agent queries are comparable to those of human searchers, with little use of query operators; (3) Web agents search for a relatively limited variety of information, with only 18% of the terms used being unique; and (4) the duration of agent-Web search engine interaction typically spans several hours. We discuss the implications for Web information agents and search engines.
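As a minimal sketch of the kind of log analysis this study describes, the snippet below derives sessions (using an assumed 30-minute inactivity cutoff), term counts and the share of unique terms from (agent, timestamp, query) records; the field names, threshold and sample data are illustrative assumptions, not the authors' actual code or data.

# Minimal sketch of agent transaction-log analysis, assuming records of the
# form (agent_id, timestamp, query). All names and values are illustrative.
from collections import defaultdict
from datetime import datetime, timedelta

records = [
    ("agent-17", datetime(1999, 5, 1, 10, 0, 0), "cheap flights sydney"),
    ("agent-17", datetime(1999, 5, 1, 10, 0, 0), "cheap flights melbourne"),
    ("agent-17", datetime(1999, 5, 1, 14, 30, 0), "hotel deals"),
]

SESSION_GAP = timedelta(minutes=30)  # new session after 30 min idle (assumed cutoff)

sessions = defaultdict(int)
last_seen = {}
terms, unique_terms = 0, set()

for agent, ts, query in sorted(records, key=lambda r: (r[0], r[1])):
    # Count a new session when the agent has been idle longer than the gap.
    if agent not in last_seen or ts - last_seen[agent] > SESSION_GAP:
        sessions[agent] += 1
    last_seen[agent] = ts
    for term in query.split():
        terms += 1
        unique_terms.add(term)

print("sessions per agent:", dict(sessions))
print("unique terms: %.0f%%" % (100 * len(unique_terms) / terms))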
Abstract:
The mineral schlossmacherite, (H3O,Ca)Al3(AsO4,PO4,SO4)2(OH)6, a multi-cation, multi-anion mineral of the beudantite mineral subgroup, has been characterised by Raman spectroscopy. The mineral and related minerals function as heavy metal collectors and are often amorphous or poorly crystalline, such that XRD identification is difficult. The Raman spectra are dominated by an intense band at 864 cm-1, assigned to the symmetric stretching mode of the AsO4^3- anion. Raman bands at 809 and 819 cm-1 are assigned to the antisymmetric stretching mode of AsO4^3-. The sulphate anion is characterised by bands at 1000 cm-1 (ν1), and at 1031, 1082 and 1139 cm-1 (ν3). Two sets of bands in the OH stretching region are observed: the first between 2800 and 3000 cm-1, with bands at 2850, 2868 and 2918 cm-1, and the second between 3300 and 3600 cm-1, with bands at 3363, 3382, 3410, 3449 and 3537 cm-1. These bands enabled the calculation of hydrogen bond distances, which span a wide range.
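The abstract does not state which correlation was used to turn OH-stretching wavenumbers into hydrogen bond distances; a common choice in the mineral Raman literature is the Libowitzky (1999) relation, and the sketch below applies it in its usually quoted form purely for illustration (the constants should be checked against the original reference).

import math

# Empirical correlation between OH-stretching wavenumber (cm^-1) and O-H...O
# distance (Angstrom), in the form commonly attributed to Libowitzky (1999):
#   nu = 3592 - 304e9 * exp(-d / 0.1321)
# The constants are quoted from the literature and are an assumption here;
# the abstract does not specify which correlation was applied.
def oh_distance(wavenumber_cm1):
    return -0.1321 * math.log((3592.0 - wavenumber_cm1) / 304.0e9)

for nu in (2850, 2868, 2918, 3363, 3382, 3410, 3449, 3537):
    print(f"{nu} cm^-1  ->  d(O...O) ~ {oh_distance(nu):.2f} A")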
Abstract:
The quality assurance of stereotactic radiotherapy and radiosurgery treatments requires the use of small-field dose measurements that can be experimentally challenging. This study used Monte Carlo simulations to establish that PAGAT dosimetry gel can be used to provide accurate, high resolution, three-dimensional dose measurements of stereotactic radiotherapy fields. A small cylindrical container (4 cm height, 4.2 cm diameter) was filled with PAGAT gel, placed in the parietal region inside a CIRS head phantom, and irradiated with a 12-field stereotactic radiotherapy plan. The resulting three-dimensional dose measurement was read out using an optical CT scanner and compared with the treatment planning prediction of the dose delivered to the gel during the treatment. A BEAMnrc DOSXYZnrc simulation of this treatment was completed to provide a standard against which the accuracy of the gel measurement could be gauged. The three-dimensional dose distributions obtained from Monte Carlo and from the gel measurement were found to be in better agreement with each other than with the dose distribution provided by the treatment planning system's pencil beam calculation. Both sets of data showed close agreement with the treatment planning system's dose distribution through the centre of the irradiated volume and substantial disagreement with the treatment planning system at the penumbrae. The Monte Carlo calculations and gel measurements both indicated that the treated volume was up to 3 mm narrower, with steeper penumbrae and more variable out-of-field dose, than predicted by the treatment planning system. The Monte Carlo simulations allowed the accuracy of the PAGAT gel dosimeter to be verified in this case, allowing PAGAT gel to be used with greater confidence in future measurements of dose from stereotactic and other radiotherapy treatments.
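The abstract does not specify the metric used to compare the Monte Carlo, gel and treatment planning dose distributions; the snippet below shows one simple way two co-registered 3D dose grids can be compared, a voxel-wise percentage dose difference with a 3% pass criterion on synthetic data, noting that a full gamma analysis would also include a distance-to-agreement criterion.

import numpy as np

# Generic voxel-wise comparison of two co-registered 3D dose grids, e.g. a
# Monte Carlo calculation and an optical-CT gel readout. Purely illustrative:
# the dose values are synthetic and the 3% criterion is an assumption.
rng = np.random.default_rng(0)
dose_mc = rng.random((50, 50, 50)) * 2.0                 # stand-in dose grid (Gy)
dose_gel = dose_mc + rng.normal(0, 0.02, dose_mc.shape)  # stand-in measurement

prescription = dose_mc.max()
mask = dose_mc > 0.1 * prescription                      # ignore very low dose voxels

diff_pct = 100.0 * (dose_gel[mask] - dose_mc[mask]) / prescription
pass_rate = np.mean(np.abs(diff_pct) <= 3.0) * 100.0

print(f"voxels within 3% of prescription dose: {pass_rate:.1f}%")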
Abstract:
The primary objective of the experiments reported here was to demonstrate the effects of opening up the design envelope for auditory alarms on the ability of people to learn the meanings of a set of alarms. Two sets of alarms were tested, one already extant and one newly-designed set for the same set of functions, designed according to a rationale set out by the authors aimed at increasing the heterogeneity of the alarm set and incorporating some well-established principles of alarm design. For both sets of alarms, a similarity-rating experiment was followed by a learning experiment. The results showed that the newly-designed set was judged to be more internally dissimilar, and easier to learn, than the extant set. The design rationale outlined in the paper is useful for design purposes in a variety of practical domains and shows how alarm designers, even at a relatively late stage in the design process, can improve the efficacy of an alarm set.
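The finding that the newly-designed set was judged more internally dissimilar can be summarised by averaging the pairwise dissimilarity ratings within each set; the sketch below illustrates that summary on invented ratings and is not the authors' analysis.

import numpy as np

# Mean pairwise dissimilarity of an alarm set from a symmetric rating matrix
# (0 = identical, 1 = maximally dissimilar). Ratings are invented.
def internal_dissimilarity(matrix):
    m = np.asarray(matrix, dtype=float)
    iu = np.triu_indices_from(m, k=1)   # each unordered pair counted once
    return m[iu].mean()

extant = [[0.0, 0.2, 0.3],
          [0.2, 0.0, 0.1],
          [0.3, 0.1, 0.0]]
redesigned = [[0.0, 0.7, 0.8],
              [0.7, 0.0, 0.6],
              [0.8, 0.6, 0.0]]

print("extant set:", internal_dissimilarity(extant))        # lower = more confusable
print("redesigned set:", internal_dissimilarity(redesigned))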
Abstract:
In this paper, we propose three meta-heuristic algorithms for the permutation flowshop (PFS) and the general flowshop (GFS) problems. Two different neighborhood structures are used for these two types of flowshop problem. For the PFS problem, an insertion neighborhood structure is used, while for the GFS problem, a critical-path neighborhood structure is adopted. To evaluate the performance of the proposed algorithms, two sets of problem instances are tested against the algorithms for both types of flowshop problems. The computational results show that the proposed meta-heuristic algorithms with insertion neighborhood for the PFS problem perform slightly better than the corresponding algorithms with critical-path neighborhood for the GFS problem. But in terms of computation time, the GFS algorithms are faster than the corresponding PFS algorithms.
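As an illustration of the insertion neighbourhood used for the PFS problem, the sketch below computes the permutation flowshop makespan and enumerates insertion moves on invented processing times; it shows the neighbourhood structure only, not the full meta-heuristics proposed in the paper.

import itertools

# Sketch of the insertion neighbourhood for the permutation flowshop (PFS)
# problem: remove a job from the sequence and reinsert it at another position,
# keeping the best resulting makespan. Processing times are invented.
def makespan(sequence, p):
    # p[job][machine] = processing time of the job on that machine
    machines = len(p[0])
    completion = [0.0] * machines
    for job in sequence:
        for m in range(machines):
            prev = completion[m - 1] if m > 0 else 0.0
            completion[m] = max(completion[m], prev) + p[job][m]
    return completion[-1]

def best_insertion_neighbour(sequence, p):
    best_seq, best_val = list(sequence), makespan(sequence, p)
    for i, j in itertools.permutations(range(len(sequence)), 2):
        cand = list(sequence)
        job = cand.pop(i)
        cand.insert(j, job)
        val = makespan(cand, p)
        if val < best_val:
            best_seq, best_val = cand, val
    return best_seq, best_val

p = [[3, 2, 4], [1, 4, 2], [5, 1, 3], [2, 3, 1]]   # 4 jobs x 3 machines
print(best_insertion_neighbour([0, 1, 2, 3], p))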
Abstract:
Aims: A multi-method study comprising two parts. Study One: three sets of observations in two regional areas of Queensland. Study Two: two sets of parent intercept interviews conducted in Toowoomba, Queensland. The aim of Study Two is to determine parents' views, opinions and knowledge of child restraint practices and the Queensland legislative amendment.
Abstract:
Foreword: In this paper I call upon a praxiological approach. Praxeology (an early alteration of praxiology) is the study of human action and conduct. The name praxeology/praxiology takes its root in praxis, Medieval Latin, from Greek: doing, action, from prassein, to do, to practice (Merriam-Webster Dictionary). Having been involved in project management education, research and practice for the last twenty years, I have constantly tried to improve and to provide a better understanding/knowledge of the field and related practice, and as a consequence to widen and deepen the competencies of the people I was working with (and my own competencies as well!), assuming that better project management leads to more efficient and effective use of resources, the development of people and, in the end, a better world. For some time I have perceived a need to clarify the foundations of the discipline of project management, or at least elucidate what these foundations could be. An immodest task, one might say! But not a neutral one! I am constantly surprised by the way the world (i.e., organizations, universities, students and professional bodies) sees project management: as a set of methods, techniques, tools, interacting with other fields – general management, engineering, construction, information systems, etc. – bringing some effective ways of dealing with various sets of problems – from launching a new satellite to product development through to organizational change.
Abstract:
The participation of the community broadcasting sector in the development of digital radio provides a potentially valuable opportunity for non-market, end user-driven experimentation in the development of these new services in Australia. However this development path is constrained by various factors, some of which are specific to the community broadcasting sector and others that are generic to the broader media and communications policy, industrial and technological context. This paper filters recent developments in digital radio policy and implementation through the perspectives of community radio stakeholders, obtained through interviews, to describe and analyse these constraints. The early stage of digital community radio presented here is intended as a baseline for tracking the development of the sector as digital radio broadcasting develops. We also draw upon insights from scholarly debates about citizens media and participatory culture to identify and discuss two sets of opportunities for social benefit that are enabled by the inclusion of community radio in digital radio service development. The first arises from community broadcasting’s involvement in the propagation of the multi-literacies that drive new digital economies, not only through formal and informal multi- and trans-media training, but also in the ‘co-creative’ forms of collaborative and participatory media production that are fostered in the sector. The second arises from the fact that community radio is uniquely placed — indeed charged with the responsibility — to facilitate social participation in the design and operation of media institutions themselves, not just their service outputs.
Abstract:
Driver response (reaction) time (tr) of the second queuing vehicle is generally longer than that of other vehicles at signalized intersections. Though this phenomenon was first reported in 1972, it is still ignored in conventional departure models. This paper highlights the need for quantitative measurement and analysis of queuing vehicle performance during spontaneous queue discharge, because such analysis can improve microsimulation. Video recordings from major cities in Australia, plus twenty-two sets of vehicle trajectories extracted from the Next Generation Simulation (NGSIM) Peachtree Street dataset, have been analyzed to better understand queuing vehicle performance in the discharge process. Findings from this research will help account for driver response time in departure models and can also be used for the calibration of microscopic traffic simulation models.
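One simple way to quantify the response time discussed here is to take, for each queued vehicle, the lag between the moment its leader starts moving and the moment it starts moving itself; the sketch below does this from speed time series sampled at 0.1 s (as in NGSIM-style trajectories) using invented data and an assumed 0.5 m/s start-of-motion threshold, and is only illustrative of how such trajectory data can be reduced.

# Illustrative estimation of queue start-up response times from speed time
# series sampled at 0.1 s. The traces and the 0.5 m/s threshold are assumed.
SAMPLE_INTERVAL = 0.1   # seconds between samples
START_THRESHOLD = 0.5   # m/s; vehicle considered moving above this speed

def start_time(speeds):
    """Time (s) at which the vehicle first exceeds the start threshold."""
    for k, v in enumerate(speeds):
        if v > START_THRESHOLD:
            return k * SAMPLE_INTERVAL
    return None

# speeds[i] = speed trace of the i-th vehicle in the queue (lead vehicle first)
speeds = [
    [0.0] * 10 + [1.0, 2.0, 3.0],   # lead vehicle starts moving at ~1.0 s
    [0.0] * 22 + [0.8, 1.5, 2.5],   # second vehicle starts at ~2.2 s
    [0.0] * 32 + [0.9, 1.8],        # third vehicle starts at ~3.2 s
]

starts = [start_time(s) for s in speeds]
response_times = [round(b - a, 1) for a, b in zip(starts, starts[1:])]
print("start times (s):", starts)
print("response times of following vehicles (s):", response_times)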
Abstract:
Using Gray and McNaughton's (2000) revised Reinforcement Sensitivity Theory (r-RST), we examined the influence of personality on processing of words presented in gain-framed and loss-framed anti-speeding messages and how the processing biases associated with personality influenced message acceptance. The r-RST predicts that the nervous system regulates personality and that behaviour is dependent upon the activation of the Behavioural Activation System (BAS), activated by reward cues, and the Fight-Flight-Freeze System (FFFS), activated by punishment cues. According to r-RST, individuals differ in the sensitivities of their BAS and FFFS (i.e., weak to strong), which in turn leads to stable patterns of behaviour in the presence of rewards and punishments, respectively. It was hypothesised that individual differences in personality (i.e., strength of the BAS and the FFFS) would influence the degree of both message processing (as measured by reaction time to previously viewed message words) and message acceptance (measured three ways: perceived message effectiveness, behavioural intentions, and attitudes). Specifically, it was anticipated that individuals with a stronger BAS would process the words presented in the gain-frame messages faster than those with a weaker BAS, and individuals with a stronger FFFS would process the words presented in the loss-frame messages faster than those with a weaker FFFS. Further, it was expected that greater processing (faster reaction times) would be associated with greater acceptance of that message. Driver licence-holding students (N = 108) were recruited to view one of four anti-speeding messages (i.e., social gain-frame, social loss-frame, physical gain-frame, and physical loss-frame). A computerised lexical decision task assessed participants' subsequent reaction times to message words, as an indicator of the extent of processing of the previously viewed message. Self-report measures assessed personality and the three message acceptance measures. As predicted, the degree of initial processing of the content of the social gain-framed message mediated the relationship between the reward sensitive trait and message effectiveness. Initial processing of the physical loss-framed message partially mediated the relationship between the punishment sensitive trait and both message effectiveness and behavioural intention ratings. These results show that reward sensitivity and punishment sensitivity traits influence cognitive processing of gain-framed and loss-framed message content, respectively, and, subsequently, message effectiveness and behavioural intention ratings. Specifically, a range of road safety messages (i.e., gain-frame and loss-frame messages) could be designed which align with the processing biases associated with personality and which would target those individuals who are sensitive to rewards and those who are sensitive to punishments.
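The mediation findings reported above rest on testing an indirect effect of the form trait -> processing -> acceptance; the sketch below shows a generic bootstrapped product-of-coefficients (a*b) test on simulated data, as one standard way such an effect can be assessed, and is not the study's actual analysis.

import numpy as np

# Minimal sketch of a bootstrapped indirect (mediation) effect:
# trait sensitivity -> message processing -> message acceptance.
# All data here are simulated for illustration only.
rng = np.random.default_rng(1)
n = 108
trait = rng.normal(size=n)                          # e.g. reward sensitivity
processing = 0.5 * trait + rng.normal(size=n)       # mediator: processing speed
acceptance = 0.4 * processing + rng.normal(size=n)  # outcome: perceived effectiveness

def ols(y, *predictors):
    """Return OLS coefficients (intercept first) of y on the given predictors."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

indirect = []
for _ in range(2000):
    i = rng.integers(0, n, n)                            # resample with replacement
    a = ols(processing[i], trait[i])[1]                  # path a: trait -> mediator
    b = ols(acceptance[i], processing[i], trait[i])[1]   # path b: mediator -> outcome, controlling for trait
    indirect.append(a * b)

lo, hi = np.percentile(indirect, [2.5, 97.5])
print(f"bootstrap 95% CI for the indirect effect: [{lo:.3f}, {hi:.3f}]")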
Abstract:
In-depth investigations of crash risks, which inform prevention and safety promotion programmes, are traditionally conducted using exposure-controlled or case-control methodology. However, these studies need either observational data for control cases or exogenous exposure data such as vehicle-kilometres travelled, entry flow, or the product of conflicting flows for a particular traffic location or site. These data are not readily available and often require extensive data collection effort on a system-wide basis. Aim: The objective of this research is to propose an alternative methodology to investigate crash risks of a road user group in different circumstances using readily available traffic police crash data. Methods: This study employs a combination of a log-linear model and the quasi-induced exposure technique to estimate crash risks of a road user group. While the log-linear model reveals the significant interactions and thus the prevalence of crashes of a road user group under various sets of traffic, environmental and roadway factors, the quasi-induced exposure technique estimates the relative exposure of that road user group under the same set of explanatory variables. Therefore, the combination of these two techniques provides relative measures of crash risks under various influences of roadway, environmental and traffic conditions. The proposed methodology has been illustrated using five years of Brisbane motorcycle crash data. Results: Interpretation of the results for different combinations of interacting factors shows that the poor conspicuity of motorcycles is a predominant cause of motorcycle crashes. The inability of other drivers to correctly judge the speed and distance of an oncoming motorcyclist is also evident in right-of-way violation motorcycle crashes at intersections. Discussion and Conclusions: The combination of a log-linear model and the quasi-induced exposure technique is a promising methodology and can be applied to better estimate crash risks of other road users. This study also highlights the importance of considering interaction effects to better understand hazardous situations. A further study comparing the proposed methodology with the case-control method would be useful.
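As a rough illustration of how the two components of the proposed methodology fit together, the sketch below fits a Poisson log-linear model to an invented two-vehicle crash table and then computes quasi-induced exposure relative involvement ratios (share of at-fault involvements divided by share of not-at-fault involvements); all counts, variable names and categories are assumptions for the example, not the Brisbane data.

import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Invented cell counts for a two-vehicle crash cross-classification.
data = pd.DataFrame({
    "road_user": ["motorcycle", "motorcycle", "car", "car"] * 2,
    "lighting":  ["day", "night"] * 4,
    "role":      ["at_fault"] * 4 + ["not_at_fault"] * 4,
    "count":     [40, 35, 300, 150, 25, 15, 315, 170],
})

# 1) Log-linear model: a Poisson GLM on the cell counts with two-way
#    interactions indicates which factor combinations are associated with
#    higher crash involvement.
model = smf.glm("count ~ (C(road_user) + C(lighting) + C(role))**2",
                data=data, family=sm.families.Poisson()).fit()
print(model.summary())

# 2) Quasi-induced exposure: not-at-fault involvements proxy for exposure, so
#    the relative involvement ratio is (share at fault) / (share not at fault).
tab = data.pivot_table(index="road_user", columns="role", values="count", aggfunc="sum")
share = tab / tab.sum()
print(share["at_fault"] / share["not_at_fault"])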