870 results for Passing of time
Abstract:
This thesis explored the state of use of e-learning tools within Learning Management Systems in higher education and developed a distinct framework to explain the factors influencing users' engagement with these tools. The study revealed that Learning Management System design, preferences for other tools, availability of time, lack of adequate knowledge about the tools, pedagogical practices, and social influences affect the uptake of Learning Management System tools. Semi-structured interviews with 74 students and lecturers at a major Australian university served as the source of data, and the applied thematic analysis method was used to analyse the collected data.
Abstract:
Wind energy, the fastest-growing renewable energy source in the world today, requires a large number of wind turbines to transform wind energy into electricity. One factor driving the cost of this energy is the reliable operation of these turbines. There is therefore a growing requirement within the wind farm community to monitor the operation of wind turbines on a continuous basis so that a possible fault can be detected ahead of time. As a wind turbine operates in an environment of constantly changing wind speed, it is a challenging task to design a fault detection technique that can accommodate the stochastic operational behavior of the turbines. Addressing this issue, this paper proposes a novel fault detection criterion that is robust against operational uncertainty and can quantify the severity level of drivetrain abnormalities within an operating wind turbine. A benchmark wind turbine model was used to simulate drivetrain fault conditions, and the effectiveness of the proposed technique was tested accordingly. The simulation results indicate that the proposed criterion exhibits consistent performance for drivetrain faults under varying wind speed and has a linear relationship with the fault severity level.
Abstract:
Traditionally, it is not easy to carry out tests to identify modal parameters from existing railway bridges because of the testing conditions and the complicated nature of civil structures. A six-year (2007-2012) research program was conducted to monitor a group of 25 railway bridges, one task of which was to devise guidelines for identifying their modal parameters. This paper presents the experience acquired from that identification. The modal analysis of four representative bridges of this group (B5, B15, B20 and B58A) along the Carajás railway in northern Brazil is reported, using three different excitation sources: drop weight, free vibration after train passage, and ambient conditions. To extract the dynamic parameters from the recorded data, the Stochastic Subspace Identification and Frequency Domain Decomposition methods were used. Finite-element models were constructed to facilitate the dynamic measurements. The results show good agreement between the measured and computed natural frequencies and mode shapes. The findings provide guidelines on methods of excitation, record length, and methods of modal analysis, including the use of projected channels and harmonic detection, helping researchers and maintenance teams obtain good dynamic characteristics from measurement data.
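The Frequency Domain Decomposition method named in this abstract can be sketched in a few lines: estimate the cross-power spectral density (CPSD) matrix of the measured responses, take a singular value decomposition at each frequency line, and read natural frequencies off the peaks of the first singular value. The following is a minimal illustrative sketch on synthetic data, not the paper's implementation; the function name, segment length, and test signal are assumptions.

```python
import numpy as np
from scipy.signal import csd

def fdd_first_singular_values(signals, fs, nperseg=1024):
    """Frequency Domain Decomposition sketch: SVD of the CPSD matrix at
    each frequency line; peaks of the first singular value indicate
    natural frequencies."""
    n = signals.shape[0]
    # Frequency grid from one CPSD estimate
    f, _ = csd(signals[0], signals[0], fs=fs, nperseg=nperseg)
    # Build the full CPSD matrix G(f), shape (n_freq, n, n)
    G = np.zeros((len(f), n, n), dtype=complex)
    for i in range(n):
        for j in range(n):
            _, G[:, i, j] = csd(signals[i], signals[j], fs=fs, nperseg=nperseg)
    # First singular value at each frequency line
    s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0]
                   for k in range(len(f))])
    return f, s1

# Synthetic example: two channels observing a decaying 5 Hz mode in noise
fs = 256.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
mode = np.sin(2 * np.pi * 5.0 * t) * np.exp(-0.05 * t)
x = np.vstack([mode + 0.1 * rng.standard_normal(t.size),
               0.8 * mode + 0.1 * rng.standard_normal(t.size)])
f, s1 = fdd_first_singular_values(x, fs)
print(f"peak near {f[np.argmax(s1)]:.2f} Hz")
```

In practice the peaks must be screened against harmonics from rotating machinery or train passage, which is why the abstract highlights harmonic detection as part of the guidelines.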
Abstract:
This article considers the integral role played by patent law in respect of stem cell research. It highlights concerns about commercialization, access to essential medicines and bioethics. The article maintains that there is a fundamental ambiguity in the Patents Act 1990 (Cth) as to whether stem cell research is patentable subject matter. There is a need to revise the legislation in light of the establishment of the National Stem Cell Centre and the passing of the Research Involving Embryos Act 2002 (Cth). The article raises concerns about the strong patent protection secured by the Wisconsin Alumni Research Foundation and Geron Corporation in respect of stem cell research in the United States. It contends that a number of legal reforms could safeguard access to stem cell lines, and resulting drugs and therapies. Finally, this article explores how ethical concerns are addressed within the framework of the European Biotechnology Directive. It examines the decision of the European Patent Office in relation to the so-called Edinburgh patent, and the inquiry of the European Group on Ethics in Science and New Technologies into The Ethical Aspects of Patenting Involving Human Stem Cells.
Abstract:
For the first decade of its existence, the concept of citizen journalism has described an approach which was seen as a broadening of the participant base in journalistic processes, but still involved only a comparatively small subset of overall society – for the most part, citizen journalists were news enthusiasts and “political junkies” (Coleman, 2006) who, as some exasperated professional journalists put it, “wouldn’t get a job at a real newspaper” (The Australian, 2007), but nonetheless followed many of the same journalistic principles. The investment – if not of money, then at least of time and effort – involved in setting up a blog or participating in a citizen journalism Website remained substantial enough to prevent the majority of Internet users from engaging in citizen journalist activities to any significant extent; what emerged in the form of news blogs and citizen journalism sites was a new online elite which for some time challenged the hegemony of the existing journalistic elite, but gradually also merged with it. The mass adoption of next-generation social media platforms such as Facebook and Twitter, however, has led to the emergence of a new wave of quasi-journalistic user activities which now much more closely resemble the “random acts of journalism” which JD Lasica envisaged in 2003. Social media are not exclusively or even predominantly used for citizen journalism; instead, citizen journalism is now simply a by-product of user communities engaging in exchanges about the topics which interest them, or tracking emerging stories and events as they happen. 
Such platforms – and especially Twitter with its system of ad hoc hashtags that enable the rapid exchange of information about issues of interest – provide spaces for users to come together to “work the story” through a process of collaborative gatewatching (Bruns, 2005), content curation, and information evaluation which takes place in real time and brings together everyday users, domain experts, journalists, and potentially even the subjects of the story themselves. Compared to the spaces of news blogs and citizen journalism sites, but also of conventional online news Websites, which are controlled by their respective operators and inherently position user engagement as a secondary activity to content publication, these social media spaces are centred around user interaction, providing a third-party space in which everyday as well as institutional users, laypeople as well as experts converge without being able to control the exchange. Drawing on a number of recent examples, this article will argue that this results in a new dynamic of interaction and enables the emergence of a more broadly-based, decentralised, second wave of citizen engagement in journalistic processes.
Abstract:
Pure-phase Cu2ZnSnS4 (CZTS) nanoparticles were successfully synthesized via a polyacrylic acid (PAA) assisted one-pot hydrothermal route. The morphology, crystal structure, composition and optical properties, as well as the photoactivity, of the as-synthesized CZTS nanoparticles were characterized by X-ray diffraction, Raman spectroscopy, scanning electron microscopy, transmission electron microscopy, X-ray photoelectron spectroscopy, UV-visible absorption spectroscopy and photoelectrochemical measurements. The influence of various synthesis conditions, such as the reaction temperature, reaction duration and the amount of PAA in the precursor solution, on the formation of the CZTS compound was systematically investigated. The results show that the crystal phase, morphology and particle size of CZTS can be tailored by controlling the reaction conditions. A formation mechanism of CZTS in the hydrothermal reaction is proposed based on the investigation of the time-dependent phase evolution of CZTS, which showed that metal sulfides (e.g., Cu2S, SnS2 and ZnS) formed first during the hydrothermal reaction, before the CZTS compound formed through nucleation. The band gap of the as-synthesized CZTS nanoparticles is 1.49 eV. A thin-film electrode based on the synthesized CZTS nanoparticles in a three-electrode photoelectrochemical cell generated a pronounced photocurrent under illumination from a red light-emitting diode (LED, 627 nm), indicating the photoactivity of the semiconductor material.
Abstract:
This study examined the role of information, efficacy, and 3 stressors in predicting adjustment to organizational change. Participants were 589 government employees undergoing an 18-month process of regionalization. To examine if the predictor variables had long-term effects on adjustment, the authors assessed psychological well-being, client engagement, and job satisfaction again at a 2-year follow-up. At Time 1, there was evidence to suggest that information was indirectly related to psychological well-being, client engagement, and job satisfaction, via its positive relationship to efficacy. There also was evidence to suggest that efficacy was related to reduced stress appraisals, thereby heightening client engagement. Last, there was consistent support for the stress-buffering role of Time 1 self-efficacy in the prediction of Time 2 job satisfaction.
Abstract:
This paper examines the global policy convergence toward high-stakes testing in schools and the use of test results to ‘steer at a distance’, particularly as it applies to policy-makers’ promise to improve teacher quality. Using Deleuze’s three syntheses of time in the context of the Australian policy blueprint Quality Education, this paper argues that using test scores to discipline teaching repeats the past habit of policy-making as continuing the problem of the unaccountable teacher. This results in local policy-making enfolding test scores in a pure past where the teacher-as-problem is resolved through the use of data from testing to deliver accountability and transparency. This use of the database returns a digitised form of inspection that is a repetition of the habit of teacher-as-problem. While dystopian possibilities are available through the database, in what Deleuze refers to as a control society, for us the challenge is to consider policy-making as a step into an unknown future, to engage with producing policy that is not grounded on the unconscious interiority of solving the teacher problem, but of imagining new ways of conceiving the relationship between policy-making and teaching.
Abstract:
In his 1987 book, The Media Lab: Inventing the Future at MIT, Stewart Brand provides an insight into the visions of the future of the media in the 1970s and 1980s. He notes that Nicholas Negroponte made a compelling case for the foundation of a media laboratory at MIT with diagrams detailing the convergence of three sectors of the media: the broadcast and motion picture industry; the print and publishing industry; and the computer industry. Brand commented: ‘If Negroponte was right and communications technologies really are converging, you would look for signs that technological homogenisation was dissolving old boundaries out of existence, and you would expect an explosion of new media where those boundaries used to be’. Two decades later, technology developers, media analysts and lawyers have become excited about the latest phase of media convergence. In 2006, the faddish Time Magazine heralded the arrival of various Web 2.0 social networking services: You can learn more about how Americans live just by looking at the backgrounds of YouTube videos—those rumpled bedrooms and toy‐strewn basement rec rooms—than you could from 1,000 hours of network television. And we didn’t just watch, we also worked. Like crazy. We made Facebook profiles and Second Life avatars and reviewed books at Amazon and recorded podcasts. We blogged about our candidates losing and wrote songs about getting dumped. We camcordered bombing runs and built open‐source software. America loves its solitary geniuses—its Einsteins, its Edisons, its Jobses—but those lonely dreamers may have to learn to play with others. Car companies are running open design contests. Reuters is carrying blog postings alongside its regular news feed. Microsoft is working overtime to fend off user‐created Linux.
We’re looking at an explosion of productivity and innovation, and it’s just getting started, as millions of minds that would otherwise have drowned in obscurity get backhauled into the global intellectual economy. The magazine announced that Time’s Person of the Year was ‘You’, the everyman and everywoman consumer ‘for seizing the reins of the global media, for founding and framing the new digital democracy, for working for nothing and beating the pros at their own game’. This review essay considers three recent books, which have explored the legal dimensions of new media. In contrast to the unbridled exuberance of Time Magazine, this series of legal works displays an anxious trepidation about the legal ramifications associated with the rise of social networking services. In his tour de force, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet, Daniel Solove considers the implications of social networking services, such as Facebook and YouTube, for the legal protection of reputation under privacy law and defamation law. Andrew Kenyon’s edited collection, TV Futures: Digital Television Policy in Australia, explores the intersection between media law and copyright law in the regulation of digital television and Internet videos. In The Future of the Internet and How to Stop It, Jonathan Zittrain explores the impact of ‘generative’ technologies and ‘tethered applications’—considering everything from the Apple Mac and the iPhone to the One Laptop per Child programme.
Abstract:
There are currently 23,500 level crossings in Australia, broadly divided into two categories: active level crossings, which are fully automatic and have boom barriers, alarm bells, flashing lights, and pedestrian gates; and passive level crossings, which are not automatic and control road traffic and pedestrian walkways solely with stop and give-way signs. Active level crossings are considered the gold standard in transport ergonomics when grade separation (i.e. constructing an over- or underpass) is not viable. In Australia, the current strategy is to upgrade passive level crossings with active controls annually, but active crossings are also associated with traffic congestion, largely as a result of extended closure times. The percentage of time level crossings are closed to road vehicles during peak periods increases with the frequency of train services. The popular perception appears to be that once a level crossing is upgraded, one is free to wash one's hands and consider the job done. However, there may also be environments where active protection is not enough, but where the setting may not justify the capital costs of grade separation. Indeed, the associated congestion and traffic delay could compromise safety by contributing to risk-taking behaviour by motorists and pedestrians. In these environments it is important to understand what human factors issues are present and to ask whether a one-size-fits-all solution is indeed the most ergonomically sound solution for today's transport needs.
Abstract:
As a new research method supplementing the existing qualitative and quantitative approaches, agent-based modelling and simulation (ABMS) may fit well within the entrepreneurship field because the core concepts and basic premises of entrepreneurship coincide with the characteristics of ABMS (McKelvey, 2004; Yang & Chandra, 2013). Agent-based simulation is a simulation method based on agent-based models, which are composed of heterogeneous agents and their behavioural rules. By repeatedly carrying out agent-based simulations on a computer, the simulations reproduce each agent's behaviour, their interactive processes, and the emerging macroscopic phenomenon over the flow of time. Using agent-based simulations, researchers may investigate the temporal or dynamic effects of each agent's behaviour.
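The mechanics described here (heterogeneous agents, individual behavioural rules, and a macroscopic pattern emerging from repeated simulation steps) can be illustrated with a toy threshold-adoption model in the spirit of classic diffusion models. This is a generic sketch, not a model from the cited works; the threshold rule, population size, and parameters are all illustrative assumptions.

```python
import random

class Agent:
    """Heterogeneous agent: each carries its own adoption threshold,
    which encodes its individual behavioural rule."""
    def __init__(self, threshold):
        self.threshold = threshold
        self.adopted = False

def run_simulation(n_agents=100, n_seed_adopters=5, steps=30, seed=1):
    rng = random.Random(seed)
    agents = [Agent(rng.uniform(0.0, 0.5)) for _ in range(n_agents)]
    for a in agents[:n_seed_adopters]:
        a.adopted = True  # a few initial "entrepreneurs"
    history = []
    for _ in range(steps):
        share = sum(a.adopted for a in agents) / n_agents
        for a in agents:
            # Behavioural rule: adopt once the population-wide adoption
            # share reaches the agent's personal threshold
            if not a.adopted and share >= a.threshold:
                a.adopted = True
        history.append(sum(a.adopted for a in agents) / n_agents)
    return history

history = run_simulation()
print(history)  # macroscopic adoption curve emerging from micro-level rules
```

The adoption curve in `history` is the emergent macroscopic phenomenon: no agent "knows" the overall trajectory, yet the population-level pattern arises from the repeated interaction of individual rules, which is exactly what makes ABMS useful for studying dynamic effects over time.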
Abstract:
Traffic law enforcement sanctions can impact on road user behaviour through general and specific deterrence mechanisms. The manner in which specific deterrence can influence recidivist behaviour can be conceptualised in different ways. While any reduction in speeding will have road safety benefits, the way in which a ‘reduction’ is determined deserves greater methodological attention and has implications for countermeasure evaluation more generally. The primary aim of this research was to assess the specific deterrent impact of penalty increases for speeding offences introduced in Queensland, Australia, in 2003; two cohorts of drivers detected for speeding before and after the penalty changes were investigated. Since the literature is relatively silent on how to assess recidivism in the speeding context, the secondary research aim was to contribute to the literature regarding ways to conceptualise and measure specific deterrence in the speeding context. We propose a novel way of operationalising four measures which reflect different ways in which a specific deterrence effect could be conceptualised: (1) the proportion of offenders who re-offended in the follow-up period; (2) the overall frequency of re-offending in the follow-up period; (3) the length of delay to re-offence among those who re-offended; and (4) the average number of re-offences during the follow-up period among those who re-offended. Consistent with expectations, results suggested an absolute deterrent effect of penalty changes, as evidenced by significant reductions in the proportion of drivers who re-offended and the overall frequency of re-offending, although effect sizes were small. Contrary to expectations, however, there was no evidence of a marginal specific deterrent effect among those who re-offended, with a significant reduction in the length of time to re-offence and no significant change in the average number of offences committed.
Additional exploratory analyses investigating potential influences of the severity of the index offence, offence history, and method of detection revealed mixed results. Access to additional data from various sources suggested that the main findings were not influenced by changes in speed enforcement activity, public awareness of penalty changes, or driving exposure during the study period. Study limitations and recommendations for future research are discussed with a view to promoting more extensive evaluations of penalty changes and better understanding of how such changes may impact on motorists’ perceptions of enforcement and sanctions, as well as on recidivist behaviour.
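The four measures operationalised in this abstract are simple cohort statistics, and their distinction is easy to see in code: measures (1) and (2) are computed over the whole cohort, while (3) and (4) condition on having re-offended. The sketch below is an illustrative implementation of those definitions, assuming each driver's record is an index-offence date plus a sorted list of re-offence dates within the follow-up period; the data structure and example dates are invented.

```python
from datetime import date

def deterrence_measures(cohort):
    """Compute the four specific-deterrence measures for one cohort.

    cohort: list of (index_offence_date, sorted list of re-offence dates
    within the follow-up period; an empty list means no re-offence).
    """
    n = len(cohort)
    reoffenders = [(idx, ds) for idx, ds in cohort if ds]
    # (1) proportion of offenders who re-offended
    proportion = len(reoffenders) / n
    # (2) overall frequency of re-offending across the whole cohort
    frequency = sum(len(ds) for _, ds in cohort) / n
    # (3) mean delay (days) to first re-offence, among re-offenders only
    delays = [(ds[0] - idx).days for idx, ds in reoffenders]
    mean_delay = sum(delays) / len(delays) if delays else None
    # (4) mean number of re-offences, among re-offenders only
    mean_count = (sum(len(ds) for _, ds in reoffenders) / len(reoffenders)
                  if reoffenders else None)
    return proportion, frequency, mean_delay, mean_count

# Hypothetical four-driver cohort
cohort = [
    (date(2003, 1, 10), [date(2003, 3, 1), date(2003, 9, 5)]),
    (date(2003, 2, 2),  [date(2003, 4, 2)]),
    (date(2003, 2, 20), []),
    (date(2003, 3, 15), []),
]
p, freq, delay, count = deterrence_measures(cohort)
print(p, freq, delay, count)
```

Separating the cohort-wide measures from the re-offender-only measures is what lets the study distinguish an absolute deterrent effect (fewer people re-offend at all) from a marginal one (those who still re-offend do so later or less often).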
Abstract:
Determining the genetic bases of adaptations and their roles in speciation is a prominent issue in evolutionary biology. Cichlid fish species flocks are a prime example of recent rapid radiations, often associated with adaptive phenotypic divergence from a common ancestor within a short period of time. In several radiations of freshwater fishes, divergence in ecomorphological traits - including body shape, colour, lips and jaws - is thought to underlie their ecological differentiation, specialization and, ultimately, speciation. The Midas cichlid species complex (Amphilophus spp.) of Nicaragua provides one of the few known examples of sympatric speciation where species have rapidly evolved different but parallel morphologies in young crater lakes. This study identified significant QTL for body shape using SNPs generated via ddRAD sequencing and geometric morphometric analyses of a cross between two ecologically and morphologically divergent, sympatric cichlid species endemic to crater Lake Apoyo: an elongated limnetic species (Amphilophus zaliosus) and a high-bodied benthic species (Amphilophus astorquii). A total of 453 genome-wide informative SNPs were identified in 240 F2 hybrids. These markers were used to construct a genetic map in which 25 linkage groups were resolved. Seventy-two segregating SNPs were linked to 11 QTL. By annotating the two most highly supported QTL-linked genomic regions, genes that might contribute to divergence in body shape along the benthic-limnetic axis in Midas cichlid sympatric adaptive radiations were identified. These results suggest that few genomic regions of large effect contribute to early-stage divergence in Midas cichlids.
Abstract:
The concept of energy gap(s) is useful for understanding the consequence of a small daily, weekly, or monthly positive energy balance and the inconspicuous shift in weight gain ultimately leading to overweight and obesity. The energy gap is a dynamic concept: an initial positive energy gap incurred via an increase in energy intake (or a decrease in physical activity) is not constant, may fade out with time if the initial conditions are maintained, and depends on the 'efficiency' with which the readjustment of the energy imbalance gap occurs over time. The metabolic response to an energy imbalance gap and the magnitude of the energy gap(s) can be estimated by at least two methods: i) assessment by longitudinal overfeeding studies, imposing (by design) an initial positive energy imbalance gap; ii) retrospective assessment based on epidemiological surveys, whereby the accumulated endogenous energy storage per unit of time is calculated from the change in body weight and body composition. In order to illustrate the difficulty of accurately assessing an energy gap, we have used, as an illustrative example, a recent epidemiological study which tracked changes in total energy intake (estimated by gross food availability) and body weight over three decades in the US, combined with total energy expenditure predicted from body weight using doubly labelled water data. At the population level, the study attempted to assess the cause of the energy gap, purported to be entirely due to increased food intake. Based on an estimate of the change in energy intake judged to be more reliable (i.e. from the same study population), together with calculations of simple energetic indices, our analysis suggests that conclusions about the fundamental causes of obesity development in a population (excess intake, low physical activity, or both) are clouded by a high level of uncertainty.
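The retrospective method described above reduces to simple arithmetic: convert the change in body composition into stored energy and divide by the elapsed time. The sketch below illustrates that calculation; the energy-density constants are commonly used approximations for fat and lean tissue, not values taken from this abstract, and the example numbers are invented.

```python
# Illustrative tissue energy densities (common approximations, not from
# the paper): fat mass ~39.5 MJ/kg, lean (fat-free) mass ~7.6 MJ/kg.
FAT_MJ_PER_KG = 39.5
LEAN_MJ_PER_KG = 7.6

def energy_gap_kj_per_day(delta_fat_kg, delta_lean_kg, days):
    """Retrospective energy gap: accumulated endogenous energy storage
    per unit of time, from the change in body composition."""
    stored_mj = delta_fat_kg * FAT_MJ_PER_KG + delta_lean_kg * LEAN_MJ_PER_KG
    return stored_mj * 1000.0 / days  # kJ stored per day

# Hypothetical example: 5 kg of fat and 1 kg of lean mass gained over a decade
gap = energy_gap_kj_per_day(5.0, 1.0, 3650)
print(round(gap, 1), "kJ/day")
```

The striking point the energy-gap literature draws from this arithmetic is how small the sustained daily imbalance is relative to daily energy turnover, which is one reason population-level attribution of the gap to intake versus activity carries so much uncertainty.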
Abstract:
This paper investigates the platoon dispersion model in the 2010 Highway Capacity Manual (HCM) that is used for forecasting downstream traffic flows when analyzing both signalized and two-way stop-controlled (TWSC) intersections. The paper focuses on the effect of platoon dispersion on the proportion of time blocked, the conflicting flow rate, and the capacity flow rate for the major-street left-turn movement at a TWSC intersection. The existing HCM 2010 methodology shows little effect on conflicting flow or capacity at various distances downstream of the signalized intersection. Two methods are suggested for computing the conflicting flow and capacity of minor-stream movements at the TWSC intersection that have more desirable properties than the existing HCM method. Further, if the existing HCM method is retained, the results suggest that the upstream-signals model be dropped from the HCM method for TWSC intersections.