139 results for "Irritability and movements"
Abstract:
The price formation of financial assets is a complex process. It extends beyond the standard economic paradigm of supply and demand to an understanding of the dynamic behavior of price variability, the price impact of information, and the implications of market participants' trading behavior for prices. In this thesis, I study aggregate market and individual asset volatility, liquidity dimensions, and causes of mispricing for US equities over a recent sample period. How are volatility forecasts modeled, what determines intradaily jumps and changes in intradaily volatility, and what drives the premium of traded equity indexes? Are they induced, for example, by the information content of lagged volatility and return parameters, or by macroeconomic news and changes in liquidity and volatility? Besides satisfying our intellectual curiosity, answers to these questions are of direct importance to investors developing trading strategies, policy makers evaluating macroeconomic policies, and arbitrageurs exploiting mispricing in exchange-traded funds. Results show that the leverage effect and lagged absolute returns improve forecasts of both the continuous component of daily realized volatility and jumps. Implied volatility does not subsume the information content of lagged returns in forecasting realized volatility and its components. The reported results are linked to the heterogeneous market hypothesis and demonstrate the validity of extending that hypothesis to returns. Depth shocks, signed order flow, the number of trades, and resiliency are the most important determinants of intradaily volatility. In contrast, spread shocks and resiliency are predictive of signed intradaily jumps. Fewer macroeconomic news announcement surprises cause extreme price movements or jumps than elevate intradaily volatility.
Finally, the premium of exchange-traded funds is significantly associated with momentum in net asset value and a number of liquidity parameters, including the spread, traded volume, and illiquidity. The mispricing of industry exchange-traded funds suggests that limits to arbitrage are driven by potential illiquidity.
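The volatility-forecasting setup described above resembles heterogeneous autoregressive (HAR) models of realized volatility. The following sketch, run on synthetic data, shows how daily, weekly, and monthly volatility components plus a leverage term and a lagged absolute return could enter such a forecast; all regressors here are illustrative simplifications, not the thesis's actual specification:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily realized volatility and returns (illustrative only).
T = 1000
rv = np.abs(rng.standard_normal(T)) * 0.01 + 0.01
ret = rng.standard_normal(T) * 0.01

def har_design(rv, ret, lag=22):
    """Build HAR-style regressors: daily, weekly, and monthly RV averages,
    plus a leverage term (lagged negative return) and lagged |return|."""
    rows, target = [], []
    for t in range(lag, len(rv) - 1):
        daily = rv[t]
        weekly = rv[t - 4:t + 1].mean()
        monthly = rv[t - 21:t + 1].mean()
        leverage = min(ret[t], 0.0)   # only negative returns enter
        abs_ret = abs(ret[t])
        rows.append([1.0, daily, weekly, monthly, leverage, abs_ret])
        target.append(rv[t + 1])      # next-day realized volatility
    return np.array(rows), np.array(target)

X, y = har_design(rv, ret)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS fit
forecast = X @ beta                             # in-sample forecasts
```

In practice the dependent variable would be split into its continuous and jump components, each forecast separately, as the abstract describes.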
Abstract:
Football, or soccer as it is more commonly known in Australia and the US, is arguably the world's most popular sport, and it generates a proportionate volume of related writing. Within this landscape, works of novel-length fiction are seemingly rare. This paper establishes and maps a substantial body of football fiction and explores the elements and qualities those works exhibit individually and collectively. In bringing together the current, limited surveys of the field, it presents the first rigorous definition of football fiction and captures the first historiography of the corpus. Drawing on distant reading methods developed in conjunction with closer textual analyses, the historiography and subsequent taxonomy represent the first articulation of relationships across the body of work, identify growth areas, and establish a number of movements and trends. In advancing the understanding of football fiction as a collective body, the paper lays foundations for further research and for consideration of the works in generic terms.
Abstract:
It is exciting to be living at a time when the big questions in biology can be investigated using modern genetics and computing [1]. Bauzà-Ribot et al. [2] take on one of the fundamental drivers of biodiversity, the effect of continental drift in the formation of the world's biota [3, 4], employing next-generation sequencing of whole mitochondrial genomes and modern Bayesian relaxed molecular clock analysis. Bauzà-Ribot et al. [2] conclude that vicariance via plate tectonics best explains the genetic divergence between subterranean metacrangonyctid amphipods currently found on islands separated by the Atlantic Ocean. This finding is a big deal in biogeography, and science generally [3], as many other presumed biotic tectonic divergences have been explained as probably due to more recent transoceanic dispersal events [4]. However, molecular clocks can be problematic [5, 6] and we have identified three issues with the analyses of Bauzà-Ribot et al. [2] that cast serious doubt on their results and conclusions. When we reanalyzed their mitochondrial data and attempted to account for problems with calibration [5, 6], modeling rates across branches [5, 7] and substitution saturation [5], we inferred a much younger date for their key node. This implies either a later trans-Atlantic dispersal of these crustaceans, or more likely a series of later invasions of freshwaters from a common marine ancestor, but either way probably not ancient tectonic plate movements.
Abstract:
A technique for analysing exhaust emission plumes from unmodified locomotives under real-world conditions is described and applied to the task of characterizing plumes from railway trains servicing an Australian shipping port. The method utilizes the simultaneous measurement, downwind of the railway line, of the following pollutants: particle number, PM2.5 mass fraction, SO2, NOx, and CO2, with the last of these being used as an indicator of fuel combustion. Emission factors are then derived in terms of the number of particles and the mass of pollutant emitted per unit mass of fuel consumed. Particle number size distributions are also presented. The practical advantages of the method are discussed, including the capacity to routinely collect emission factor data for passing trains and thereby build up a comprehensive real-world database for a wide range of pollutants. Samples from 56 train movements were collected, analyzed, and presented. The quantitative results for emission factors are: EF(N) = (1.7±1)×10¹⁶ kg⁻¹, EF(PM2.5) = (1.1±0.5) g·kg⁻¹, EF(NOx) = (28±14) g·kg⁻¹, and EF(SO2) = (1.4±0.4) g·kg⁻¹. The findings are compared with comparable previously published work. Statistically significant (p < α, α = 0.05) correlations within the group of locomotives sampled were found between the emission factors for particle number and both SO2 and NOx.
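The CO2-tracer logic behind fuel-based emission factors can be sketched as follows. The fuel carbon fraction and the in-plume excess concentrations below are illustrative assumptions, not values measured in the study:

```python
# Plume-based emission factor: a minimal sketch of the CO2-tracer method,
# in which excess CO2 in the plume indexes the mass of fuel burned.
# The diesel carbon fraction and example concentrations are assumptions.

FUEL_CARBON_FRACTION = 0.86                              # kg C per kg diesel (assumed)
EF_CO2 = FUEL_CARBON_FRACTION * 44.0 / 12.0 * 1000.0     # g CO2 emitted per kg fuel

def emission_factor(pollutant_excess, co2_excess):
    """Mass-based EF (g pollutant per kg fuel) from in-plume excess
    concentrations above background (same units for both arguments)."""
    return pollutant_excess / co2_excess * EF_CO2

# Example: a NOx excess of 50 units against a CO2 excess of 5500 units.
ef_nox = emission_factor(50.0, 5500.0)   # ~28.7 g per kg fuel
```

The particle-number emission factor EF(N) follows the same ratio, with a number concentration in the numerator instead of a mass concentration.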
Abstract:
The Land Of Ludos is a proposal, or design concept, for a game that re-imagines the recorded Bluetooth device movements from the 2011 48 Hour Game Making Challenge as an interactive narrative experience. As game developers, we find that the most interesting elements of the 48 Hour challenge data visualisation project are not the measurement or analysis of process, but the relationships and narratives created during the experience. [excerpt from truna aka j.turner, Thomas & Owen, 2013] See: truna aka j.turner, Thomas & Owen (2013) Living the indie life: mapping creative teams in a 48 hour game jam and playing with data, in Proc. IE'2013, 9th Australasian Conference on Interactive Entertainment, September 30 – October 1, 2013, Melbourne, VIC, Australia.
Abstract:
In recent years, restorative justice has surfaced as a new criminal justice practice in diverse parts of the world. Often, it appears that these practices have emerged in complete isolation from one another. This prompts us to question what it is that has allowed restorative justice to become an acceptable way of dealing with criminal justice issues, or in Foucault's terms, the 'conditions of emergence' of restorative justice. This article explores one of numerous potential 'conditions of emergence' of restorative justice: the discourses of the 'therapeutic', 'recovery', 'self-help' and 'New Age' movements. It aims to investigate the ways in which the taken-for-granted nature of these discourses has, in part, permitted restorative practices to become an approved way of 'doing justice'.
Abstract:
It is widely accepted in the literature on restorative justice that restorative practices emerged at least partly as a result of the recent shift towards recognising the rights of victims of crime, and increasing the involvement of victims in the criminal justice system. This article seeks to destabilise this claim. Although it accepts that there is a relationship between the emergence of a strong victims' rights movement and the emergence of restorative justice, it argues that this relationship is more nuanced, complex and contingent than advocates of restorative justice allow.
Abstract:
As researchers interested in the pursuit of high quality/high equity literacy learning outcomes, we focus on the learning experiences of five early years French students, with special regard for those already considered at risk of educational failure. We narrow the empirical focus to a single lesson on a mechanical concept of print, that is, matching lower- and upper-case alphabet letters. In doing so, we examine a deeply philosophical question: which pedagogical practices dis/enable what sorts of early years students as literacy learners? We extend Cazden's (2006) notion of 'weaving' knowledge across dimensions of knowing to describe how the case study teacher 'weaves' visible and invisible pedagogies across the four movements of a lesson. The findings reveal that different pedagogical framings (Bernstein, 1996) have potentially different cognitive and social effects, constituting different kinds of literacy knowledge and oppressive subject positions for at-risk students (Young, 1990).
Abstract:
This paper introduces an improved line tracker that fuses IMU and vision data for visual servoing tasks. We utilize an Image Jacobian which relates the motion of a line feature to the corresponding camera movements; these camera motions are estimated using an IMU. We demonstrate the impact of the proposed method in challenging environments: maximum angular rate ~160°/s, acceleration ~6 m/s², and cluttered outdoor scenes. Simulation results and a quantitative tracking performance comparison with the Visual Servoing Platform (ViSP) are also presented.
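The abstract does not give the tracker's Image Jacobian, but a standard related result illustrates how IMU data can predict line motion between frames: under pure camera rotation R (integrated from gyroscope rates), image points map by the homography H = K R K⁻¹, and homogeneous lines map by its inverse transpose. A minimal sketch assuming a calibrated pinhole camera; this illustrates the general idea, not the paper's formulation:

```python
import numpy as np

def rotation_from_gyro(omega, dt):
    """Rodrigues rotation matrix from a body angular rate (rad/s) over dt."""
    theta = np.linalg.norm(omega) * dt
    if theta < 1e-12:
        return np.eye(3)
    k = omega / np.linalg.norm(omega)
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def predict_line(l, K_cam, omega, dt):
    """Predict a homogeneous image line l = (a, b, c) after a pure camera
    rotation. Points map by H = K R K^-1, so lines map by H^-T."""
    H = K_cam @ rotation_from_gyro(omega, dt) @ np.linalg.inv(K_cam)
    l_new = np.linalg.inv(H).T @ l
    return l_new / np.linalg.norm(l_new[:2])   # unit normal convention
```

The predicted line can seed the tracker's search window in the next frame, which is what makes high angular rates survivable.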
Abstract:
Although Parkinson's disease (PD) is a complex disease for which appropriate nutrition management is important, limited evidence is currently available to support dietetic practice. Existing PD-specific guidelines do not span all phases of the Nutrition Care Process (NCP). This study aimed to document PD-specific nutrition management practice by Australian and Canadian dietitians. DAA members and PEN subscribers were invited to participate in an online survey (late 2011). Eighty-four dietitians responded (79.8% Australian); the majority (70.2%) worked in the clinical setting. Existing non-PD guidelines were used by 52.4%, while 53.6% relied on self-initiated literature reviews. Weight loss/malnutrition, protein intake, dysphagia, and constipation were common issues in all NCP phases, and respondents requested more information and evidence on these topics. Malnutrition screening (82.1%) and assessment (85.7%) were routinely performed. One-third did not receive referrals for weight loss in overweight/obesity. Protein intakes meeting gender/age recommendations (69.0%) and high-energy/high-protein diets to manage malnutrition (82.1%) were the most commonly used strategies. Constipation was managed with high-fibre diets (86.9%). Recommendations for the spacing of meals and PD medications varied, with 34.5% making no recommendation. Nutritional diagnosis (70.2%) and stage of disease (61.9%) guided monitoring frequency. Common outcome measures included appropriate weight change (97.6%) and regular bowel movements (88.1%). With limited PD-specific guidance, dietitians applied the best available evidence for other groups with similar issues, and they requested evidence-based guidelines specifically for the nutritional management of PD. Guideline development should focus on the areas reported as commonly encountered; this process can also identify gaps in evidence to guide future research.
Abstract:
Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a 'noisy' environment such as contemporary social media, is to collect the pertinent information: information for a specific study, tweets that can inform emergency services or other responders to an ongoing crisis, or posts that give an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing over time, and both collection and analytic methodologies need to be continually adapted in response to this changing information. While many of the datasets collected and analyzed are preformed (that is, built around a particular keyword, hashtag, or set of authors), they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting those which need to be immediately placed in front of responders, for manual filtering and possible action. The paper suggests two solutions for this: content analysis and user profiling. In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify users who either serve as amplifiers of information or are known as authoritative sources.
Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, while knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and women create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form, and emotions that has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services are still niche operations, much of the value of information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis. A strategy for responding to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in this sense, keyword creation is part strategy and part art.
In this paper we describe strategies for the creation of a social media archive, specifically tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement's presence on Twitter changes over time. We also discuss opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme among these papers is that of constructing a dataset, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and subsequent discussion, the panel will inform the wider research community not only about the objectives and limitations of data collection, live analytics, and filtering, but also about current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
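The two filtering solutions proposed in the first paper, content analysis and user profiling, can be sketched as follows. The keywords, weights, user handles, and escalation threshold here are all hypothetical:

```python
# Minimal sketch of crisis-tweet triage via content scoring and user
# profiling. All keywords, weights, handles and thresholds are assumed
# for illustration, not taken from the panel papers.

URGENT_KEYWORDS = {"flood": 3, "trapped": 5, "evacuate": 4, "help": 2}
AUTHORITATIVE_USERS = {"qldpolice", "bomqld"}   # hypothetical trusted handles

def content_score(text):
    """Sum the weights of urgent keywords appearing in the tweet text."""
    words = text.lower().split()
    return sum(w for kw, w in URGENT_KEYWORDS.items() if kw in words)

def should_escalate(tweet, score_threshold=5):
    """Escalate to responders on a high content score or a trusted source."""
    if tweet["user"].lower() in AUTHORITATIVE_USERS:
        return True
    return content_score(tweet["text"]) >= score_threshold

tweets = [
    {"user": "someone", "text": "Family trapped by flood please help"},
    {"user": "someone", "text": "Nice weather today"},
]
flags = [should_escalate(t) for t in tweets]   # [True, False]
```

In a live deployment the keyword list itself would be the iteratively refined artifact, as the panel's common theme suggests.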
Abstract:
With the advancement of new technologies, in 2010 this author started to engineer an online learning environment for investigating the nature and development of spatial abilities and the teaching and learning of geometry. This paper documents how this new digital learning environment affords the opportunity to integrate learning about 3D shapes with direction, location, and movement, and how young children can mentally and visually construct virtual 3D shapes using movements in both egocentric and fixed frames of reference (FOR). Findings suggest that Year 4 (aged 9) children can develop the capacity to construct a cube using an egocentric FOR only, a fixed FOR only, or a combination of both. However, these young participants were unable to articulate the effect of individual or combined FOR movements. Directions for future research are proposed.
Abstract:
The integration of separate, yet complementary, cortical pathways appears to play a role in visual perception and action when intercepting objects. The ventral system is responsible for object recognition and identification, while the dorsal system facilitates continuous regulation of action. This dual-system model implies that empirically manipulating different visual information sources during performance of an interceptive action might lead to the emergence of distinct gaze and movement pattern profiles. To test this idea, we recorded hand kinematics and eye movements of participants as they attempted to catch balls projected from a novel apparatus that synchronised or de-synchronised accompanying video images of a throwing action and ball trajectory. Results revealed that ball catching performance was less successful when patterns of hand movements and gaze behaviours were constrained by the absence of advance perceptual information from the thrower's actions. Under these task constraints, participants began tracking the ball later, followed less of its trajectory, and adapted their actions by initiating movements later and moving the hand faster. There were no performance differences when the throwing action image and ball speed were synchronised or de-synchronised, since hand movements were closely linked to information from the ball trajectory. Results are interpreted relative to the two-visual-system hypothesis, demonstrating that accurate interception requires integration of advance visual information from the kinematics of the throwing action and from the ball flight trajectory.
Abstract:
Royal commissions are approached not as exercises in legitimation and closure but as sites of struggle that are heavily traversed by power holders yet are open to the voices of alternative and unofficial social groups, social movements, and individuals. Three case studies are discussed that highlight the hegemony of the legal methodology and discourse that dominate many inquiries. The first case study, a single-case miscarriage inquiry, involves a man who was accused, convicted, and served a prison sentence for the murder of his wife. Nineteen years after the murder, another man confessed to the crime. The official inquiry found that nothing had gone wrong in the criminal justice process; it had operated as it should. Thus, in the face of evidence that the criminal justice process may be flawed, the discursive strategy became one of silence; no explanation was offered except for the declaration that nothing had gone wrong. The fallibility of the criminal justice system was thus hidden from public view. The second case study examines the Wood Royal Commission into corruption charges within the NSW Police Service. The royal commission revealed a bevy of police misconduct offenses including process corruption, improper associations, theft, and substance abuse, among others. The author discusses the ways in which the other criminal justice players, the judiciary and prosecuting attorneys, emerge only briefly as potential ethical agents in relation to police misconduct and corruption and then abruptly disappear again. These other players are thereby absolved of any responsibility for police misconduct. The third case study involves a spin-off inquiry into the facts surrounding the Leigh Leigh rape and murder case. This case illustrates how official inquiries can seek to exclude non-traditional viewpoints and methodologies; in this case, the views of a feminist criminologist.
The third case also illustrates how the adversarial process within the legal system allows those with power to subjugate the viewpoints of others through the legitimate use of cross-examination. These three case studies reveal how official inquiries tend to speak from an “idealized conception of justice” and downplay any viewpoint that questions this idealized version of the truth.
Abstract:
Covertly tracking mobile targets, either animal or human, in previously unmapped outdoor natural environments using off-road robotic platforms requires both visual and acoustic stealth. Whilst the use of robots for stealthy surveillance is not new, the majority of approaches consider only navigation for visual covertness. However, most fielded robotic systems have a non-negligible acoustic footprint arising from the onboard sensors, motors, computers, and cooling systems, and also from the wheels interacting with the terrain during motion. This time-varying acoustic signature can jeopardise any visual covertness and needs to be addressed in any stealthy navigation strategy. In previous work, we addressed the initial concepts for acoustically masking a tracking robot's movements as it travels between observation locations selected to minimise its detectability by a dynamic natural target while ensuring continuous visual tracking of the target. This work extends the overall concept by examining the utility of real-time acoustic signature self-assessment and the exploitation of shadows as hiding locations in a combined visual and acoustic stealth framework.