750 results for Denver
Abstract:
Long-term changes in the genetic composition of a population occur by the fixation of new mutations, a process known as substitution. The rate at which mutations arise in a population and the rate at which they are fixed are expected to be equal under neutral conditions (Kimura, 1968). Between the appearance of a new mutation and its eventual fate of fixation or loss, there will be a period in which it exists as a transient polymorphism in the population (Kimura and Ohta, 1971). If the majority of mutations are deleterious (and nonlethal), the fixation probabilities of these transient polymorphisms are reduced and the mutation rate will exceed the substitution rate (Kimura, 1983). Consequently, different apparent rates may be observed on different time scales of the molecular evolutionary process (Penny, 2005; Penny and Holmes, 2001). The substitution rate of the mitochondrial protein-coding genes of birds and mammals has been traditionally recognized to be about 0.01 substitutions/site/million years (Myr) (Brown et al., 1979; Ho, 2007; Irwin et al., 1991; Shields and Wilson, 1987), with the noncoding D-loop evolving several times more quickly (e.g., Pesole et al., 1992; Quinn, 1992). Over the past decade, there has been mounting evidence that instantaneous mutation rates substantially exceed substitution rates, in a range of organisms (e.g., Denver et al., 2000; Howell et al., 2003; Lambert et al., 2002; Mao et al., 2006; Mumm et al., 1997; Parsons et al., 1997; Santos et al., 2005). The immediate reaction to the first of these findings was that the polymorphisms generated by the elevated mutation rate are short-lived, perhaps extending back only a few hundred years (Gibbons, 1998; Macaulay et al., 1997). That is, purifying selection was thought to remove these polymorphisms very rapidly.
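The neutral expectation cited here (Kimura, 1968) follows from a two-line argument. In a diploid population of effective size N, new neutral mutations enter at rate 2Nμ per site per generation, and each fixes with probability 1/(2N), so the substitution rate k is:

```latex
k \;=\; 2N\mu \times \frac{1}{2N} \;=\; \mu
```

independent of population size. For deleterious mutations the fixation probability falls below 1/(2N), so k < μ, which is exactly the gap between short-term mutation rates and long-term substitution rates described above.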
Abstract:
This paper reports on an experiment conducted to determine the extent to which group dynamics impact the effectiveness of software development teams. The experiment was conducted with software engineering project students at the Queensland University of Technology (QUT).
Abstract:
High Dynamic Range (HDR) imaging was used to collect luminance information at workstations in two open-plan office buildings in Queensland, Australia: one lit by skylights, vertical windows and electric light, and another by skylights and electric light. This paper compares illuminance and luminance data collected in these offices with occupant feedback to evaluate these open-plan environments based on available and emerging metrics for visual comfort and glare. This study highlights issues of daylighting quality and measurement specific to open-plan spaces. The results demonstrate that overhead glare is a serious threat to user acceptance of skylights, and that electric and daylight integration and controls have a major impact on the perception of daylighting quality. With regard to the measurement of visual comfort, it was found that the Daylight Glare Probability (DGP) gave poor agreement with occupant reports of discomfort glare in open-plan spaces with skylights, while the CIE Glare Index (CGI) gave the best agreement. Horizontal and vertical illuminances gave no indication of visual comfort in these spaces.
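For context, the Daylight Glare Probability metric evaluated in this study is commonly given (after Wienold and Christoffersen, 2006) in the form below; this is the widely cited expression, quoted here for reference rather than taken from this paper:

```latex
\mathrm{DGP} = 5.87 \times 10^{-5}\, E_v
  + 9.18 \times 10^{-2}\, \log_{10}\!\left( 1 + \sum_i \frac{L_{s,i}^{2}\, \omega_{s,i}}{E_v^{1.87}\, P_i^{2}} \right)
  + 0.16
```

where E_v is the vertical illuminance at the eye, L_{s,i} and ω_{s,i} are the luminance and solid angle of glare source i, and P_i is its position index. Because P_i grows for sources far above the line of sight, the summation term strongly discounts overhead sources such as skylights, which is one plausible reason DGP agreed poorly with reported discomfort in these spaces.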
Abstract:
A new decision-making tool that will assist designers in the selection of appropriate daylighting solutions for buildings in tropical locations has been previously proposed by the authors. Through an evaluation matrix that prioritizes the parameters that best respond to the needs of tropical climates (e.g. reducing solar gain and protection from glare), the tool determines the most appropriate devices for specific climate and building inputs. The tool is effective in demonstrating the broad benefits and limitations of the different daylight strategies for buildings in the tropics. However, for thorough analysis and calibration of the tool, validation is necessary. This paper presents a first step in the validation process. RADIANCE simulations were conducted to compare simulated performance with the performance predicted by the tool. To this end, an office building case study in subtropical Brisbane, Australia, and five different daylighting devices, including openings, light guiding systems and light transport systems, were simulated. Illuminance, light uniformity, daylight penetration and glare were assessed for each device. The results indicate the tool can appropriately rank and recommend daylighting strategies based on specific building inputs for tropical and subtropical regions, making it a useful resource for designers.
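The evaluation-matrix logic described above can be sketched as a simple weighted-scoring routine. All criterion names, weights, and device scores below are invented for illustration; they are not the authors' actual parameters.

```python
# Hypothetical sketch of an evaluation matrix: each candidate daylighting
# device is scored against climate-driven criteria, the scores are weighted
# by tropical priorities, and devices are ranked by total score.

# Criterion weights reflecting tropical priorities (assumed values)
WEIGHTS = {
    "solar_gain_control": 0.40,
    "glare_protection": 0.35,
    "daylight_penetration": 0.25,
}

# Illustrative device scores on each criterion, 0-10 scale (assumed values)
DEVICES = {
    "simple_opening": {"solar_gain_control": 3, "glare_protection": 2, "daylight_penetration": 8},
    "light_shelf":    {"solar_gain_control": 7, "glare_protection": 8, "daylight_penetration": 6},
    "light_pipe":     {"solar_gain_control": 9, "glare_protection": 9, "daylight_penetration": 4},
}

def rank_devices(devices, weights):
    """Return (device, weighted score) pairs sorted best-first."""
    scored = {
        name: sum(weights[c] * s for c, s in scores.items())
        for name, scores in devices.items()
    }
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

for name, score in rank_devices(DEVICES, WEIGHTS):
    print(f"{name}: {score:.2f}")
```

A real calibration, as the paper describes, would compare such rankings against RADIANCE simulation results for each device.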
Abstract:
The article focuses on the evidence-based information practice (EBIP) applied at the Auraria Library in Denver, Colorado during the reorganization of its technical services division. Collaboration processes were established for the technical services division through the reorganization and redefinition of workflows. Several factors form part of the redefinition of roles, including personal interests, department needs, and library needs. A collaborative EBIP environment was created in the division by addressing issues of workplace hierarchies, by distributing problem solving, and by encouraging reflective dialogue.
'Information in context' : co-designing workplace structures and systems for organizational learning
Abstract:
With the aim of advancing professional practice through a better understanding of how to create workplace contexts that cultivate individual and collective learning through situated 'information in context' experiences, this paper presents insights gained from three North American collaborative design (co-design) implementations. In the current project at the Auraria Library in Denver, Colorado, USA, participants use collaborative information practices to redesign face-to-face and technology-enabled communication, decision-making, and planning systems. Design processes and results to date are described within an appreciative framework which values information sharing and enables knowledge creation through shared leadership.
Abstract:
The purpose of this paper is to demonstrate the efficacy of collaborative evidence based information practice (EBIP) as an organizational effectiveness model. Shared leadership, appreciative inquiry and knowledge creation theoretical frameworks provide the foundation for change toward the implementation of a collaborative EBIP workplace model. Collaborative EBIP reiterates the importance of gathering the best available evidence, but it differs by shifting decision-making authority from "library or employer centric" to "user or employee centric". The University of Colorado Denver's Auraria Library Technical Services department created a collaborative EBIP environment by flattening workplace hierarchies, distributing problem solving and encouraging reflective dialogue. By doing so, participants are empowered to identify problems, create solutions, and become valued and respected leaders and followers. In an environment where library budgets are in jeopardy, recruitment opportunities are limited and the workplace is in constant flux, the Auraria Library case study offers an approach that maximizes the capability of the current workforce and promotes agile responsiveness to industry and organizational challenges. Collaborative EBIP is an organizational model demonstrating a process that focuses first on the individual and moves to the collective to develop a responsive and high performing business unit, and in turn, organization.
Abstract:
Background Epidemiological studies have shown a reduced incidence of cardiovascular disease in the Mediterranean population, attributed to the consumption of dietary olive oil rich in antioxidants. This has led to increased interest in the antioxidant properties of other phenolic compounds of olive tree products. It has been suggested that olive leaf extract may also have health benefits due to its antioxidant and anti-inflammatory activities. Antioxidants can prevent the effects of oxidative metabolism by scavenging free radicals and decreasing the hyperactivity of platelets associated with the development of occlusive thrombosis. To our knowledge, no studies to date have investigated the effects of olive leaf extract on platelet function. Improved understanding of the antioxidant properties of olive leaf extract and its effect on platelet function could lead to improved cardiovascular health. Objective The current study used an olive leaf extract prepared from the Olea europaea L. tree. The aim was to determine whether polyphenols in olive leaf extract would reduce platelet activity, and to establish an optimal dose in vitro that would reduce platelet aggregation and ATP release. Design Eleven subjects with normal platelet counts (150–400 × 10⁹/L) were recruited for the current in vitro study. Olive leaf extract was added to citrated whole blood to obtain five concentrations ranging from 5.4 µg/mL to 54.0 µg/mL for a dose-response curve. Baseline samples without olive leaf extract were used as a negative control for each subject. After 2 hours of incubation with olive leaf extract, samples were analyzed for platelet aggregation and ATP release from platelets stimulated by the addition of collagen. Results Whole blood analysis (n=11) showed a clear dose-dependent reduction in platelet aggregation with increasing olive leaf extract concentrations (p<0.0001). There was also a similar decrease in ATP release from collagen-stimulated platelets (p=0.02).
Conclusion In the current study the olive leaf extract obtained from Olea europaea L. inhibited platelet aggregation and ATP release from collagen stimulated platelets in vitro. This study suggests olive leaf extract may prevent occlusive thrombosis by reducing platelet hyperactivity.
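A dose-response analysis of this kind can be illustrated with a minimal least-squares sketch. The concentrations below echo the reported 5.4–54.0 µg/mL range, but the aggregation values are invented placeholders, not the study's data.

```python
# Hypothetical dose-response data: platelet aggregation (% of baseline)
# falling as olive leaf extract concentration rises. Values are invented.
doses = [0.0, 5.4, 10.8, 21.6, 43.2, 54.0]            # extract, ug/mL (assumed spacing)
aggregation = [100.0, 92.0, 85.0, 74.0, 60.0, 52.0]   # % of baseline (hypothetical)

def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

b = slope(doses, aggregation)
print(f"slope: {b:.2f} %-aggregation per ug/mL")  # negative slope => dose-dependent inhibition
```

A negative fitted slope is the signature of the dose-dependent inhibition the abstract reports; the actual study would test its significance (here, p<0.0001).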
Abstract:
Social media platforms are of interest to interactive entertainment companies for a number of reasons. They can operate as a platform for deploying games, serve as a tool for communicating with customers and potential customers, and can provide analytics on how players utilize the game, giving immediate feedback on design decisions and changes. However, as ongoing research with Australian developer Halfbrick, creators of $2, demonstrates, the use of these platforms is not universally seen as a positive. The incorporation of Big Data into already innovative development practices has the potential to cause tension between designers, whilst the platform also challenges the traditional business model, relying on micro-transactions rather than an up-front payment and requiring a substantial shift in design philosophy to take advantage of the social aspects of platforms such as Facebook.
Abstract:
Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a ‘noisy’ environment such as contemporary social media, is to collect the pertinent information: information for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or posts which give an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the data sets collected and analyzed are preformed, that is, built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions for this: content analysis and user profiling. In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as an authoritative source.
Through these techniques, the information contained in a large dataset could be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services are still niche operations, much of the value of the information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis. A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in a way, keyword creation is part strategy and part art.
In this paper we describe strategies for the creation of a social media archive, specifically tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement’s presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme amongst these papers is that of constructing a data set, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and subsequent discussion, the panel will inform the wider research community not only on the objectives and limitations of data collection, live analytics, and filtering, but also on current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
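The content-analysis scoring described in the first paper could be sketched as follows. The keyword lists, account names, weights, and function names below are illustrative assumptions, not the panel's actual coding scheme.

```python
# Hypothetical crisis-tweet triage: score each tweet by topical and urgency
# keywords, boost known authoritative accounts, and keep only as many
# top-scoring tweets as responders can manually review.

TOPIC_TERMS = {"flood", "evacuate", "bridge", "road"}   # assumed event keywords
URGENCY_TERMS = {"help", "trapped", "urgent", "now"}    # assumed urgency cues
AUTHORITATIVE = {"@qld_emergency", "@abcnews"}          # assumed authoritative sources

def score_tweet(text, author):
    """Higher score = more likely to need immediate responder attention."""
    words = {w.strip("#@,.!?").lower() for w in text.split()}
    score = len(words & TOPIC_TERMS) + 2 * len(words & URGENCY_TERMS)
    if author.lower() in AUTHORITATIVE:
        score += 3  # known amplifier/authority gets a fixed boost
    return score

def filter_for_responders(tweets, capacity):
    """Keep only the top-scoring (text, author) pairs, matched to capacity."""
    ranked = sorted(tweets, key=lambda t: score_tweet(*t), reverse=True)
    return ranked[:capacity]
```

As the panel notes, the term sets themselves would be iteratively refined as the event's keywords and hashtags drift over time.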
Abstract:
This paper draws on comparative analyses of Twitter data sets – over time and across different kinds of natural disasters and different national contexts – to demonstrate the value of shared, cumulative approaches to social media analytics in the context of crisis communication.
Abstract:
This paper uses innovative content analysis techniques to map how the death of Oscar Pistorius' girlfriend, Reeva Steenkamp, was framed in Twitter conversations. Around 1.5 million posts from a two-week timeframe are analyzed with a combination of syntactic and semantic methods. This analysis is grounded in the frame analysis perspective and differs from sentiment analysis. Instead of looking for explicit evaluations, such as “he is guilty” or “he is innocent”, we show through the results how opinions can be identified by complex articulations of more implicit symbolic devices, such as examples and metaphors repeatedly mentioned. Different frames are adopted by users as more information about the case is revealed: from a more episodic one, dominant at the very beginning, to more systemic approaches highlighting the association of the event with urban violence, gun control issues, and violence against women. A detailed timeline of the discussions is provided.
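The frame-identification approach, counting recurring symbolic devices rather than explicit evaluations, might be sketched as below. The frame vocabularies are invented placeholders, not the study's codebook.

```python
# Minimal sketch of frame detection: each frame is a set of marker terms
# (symbolic devices, recurring examples), and we count how many posts
# invoke each frame. Marker lists here are illustrative assumptions.
from collections import Counter

FRAMES = {
    "urban_violence": {"crime", "intruder", "burglar", "safety"},
    "gun_control": {"gun", "firearm", "license", "ammunition"},
    "violence_against_women": {"abuse", "domestic", "femicide", "women"},
}

def frame_counts(posts):
    """Count, per frame, how many posts mention any of its marker terms."""
    counts = Counter()
    for post in posts:
        words = set(post.lower().split())
        for frame, markers in FRAMES.items():
            if words & markers:
                counts[frame] += 1
    return counts
```

Tracking these counts in time windows would yield the kind of timeline the paper reports, showing the shift from episodic to systemic frames.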
Abstract:
In this paper, we explore the use of Twitter as a political tool in the 2013 Australian Federal Election. We employ a ‘big data’ approach that combines qualitative and quantitative methods of analysis. By tracking the accounts of politicians and parties, and the tweeting activity to and around these accounts, as well as conversations on particular hashtagged topics, we gain a comprehensive insight into the ways in which Twitter is employed in the campaigning strategies of different parties. We compare and contrast the use of Twitter by political actors with its adoption by citizens as a tool for political conversation and participation. Our study provides an important longitudinal counterpoint, and opportunity for comparison, to the use of Twitter in previous Australian federal and state elections. Furthermore, we offer innovative methodologies for data gathering and evaluation that can contribute to the comparative study of the political uses of Twitter across diverse national media and political systems.
Abstract:
In this paper, we provide an account-centric analysis of the tweeting activity of, and public response to, Pope Benedict XVI via the @pontifex Twitter account(s). We focus our investigation on the particular phase around Pope Benedict XVI’s resignation to generate insights into the use of Twitter in response to a celebrity crisis event. Through a combined qualitative and quantitative methodological approach we generate an overview of the follower-base and tweeting activity of the @pontifex account. We identify a very one-directional communication pattern (many @mentions by followers yet zero @replies from the papal account itself), which prompts us to enquire further into what the public resonance of the @pontifex account is. We also examine reactions to the resurrection of the papal Twitter account by Pope Benedict XVI’s successor. In this way, we provide a comprehensive analysis of the public response to the immediate events around the crisis event of Pope Benedict XVI’s resignation and its aftermath via the network of users involved in the @pontifex account.