Abstract:
Although timber plantations and forests are classified as forms of agricultural production, ownership of this land classification is not limited to rural producers. Timber plantations and forests are now regarded as a long-term investment with both institutional and absentee owners. While the NCREIF property indices have been the benchmarks for measuring the performance of the commercial property market in the UK for many years, the IPD Forestry Index has recently emerged as the UK forest and timberland performance indicator. The IPD Forestry Index incorporates 126 properties across five regions in the UK. This paper utilises the IPD Forestry Index to examine the performance of UK timber plantations and forests over the period 1981-2004. In particular, the issues critically assessed include plantation and forest performance analysis, comparative investment analysis, the role of plantations and forests in investment portfolios, the risk reduction and portfolio benefits of plantations and forests in mixed-asset portfolios, and the strategic investment significance of UK timberland.
Abstract:
Timberland is now regarded as a long-term investment with both institutional investors and absentee owners. This paper utilises the NCREIF Timberland Index to examine the performance of US timberland over the period 1987-1999. US timberland was found to provide significant risk reduction and portfolio diversification benefits, resulting from its low risk and low correlation with stocks and bonds. Timberland was also found to make a significant contribution to a portfolio of stocks, bonds and real estate, particularly at low to midrange portfolio risk levels.
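As a rough illustration of the risk-reduction mechanism this abstract describes, the sketch below compares the standard deviation of a mixed-asset portfolio with and without a timberland allocation. All volatilities, correlations and weights are assumed placeholders for illustration; they are not values from the NCREIF Timberland Index or the paper.

```python
import numpy as np

# Illustrative mixed-asset portfolio sketch: every figure below is a
# hypothetical placeholder, not a value reported in the paper.
vol = np.array([0.15, 0.06, 0.10])           # assumed annual std devs: stocks, bonds, timberland
corr = np.array([[1.0, 0.3, 0.1],            # assumed low correlation of timberland
                 [0.3, 1.0, 0.0],            # with stocks and bonds
                 [0.1, 0.0, 1.0]])
cov = np.outer(vol, vol) * corr

def portfolio_risk(weights):
    """Annualised standard deviation of a weighted portfolio."""
    w = np.asarray(weights)
    return float(np.sqrt(w @ cov @ w))

without_timber = portfolio_risk([0.60, 0.40, 0.00])
with_timber = portfolio_risk([0.50, 0.35, 0.15])
print(f"60/40 stocks-bonds risk:        {without_timber:.3%}")
print(f"with 15% timberland allocation: {with_timber:.3%}")
```

Under these assumed inputs the low-correlation asset lowers overall portfolio risk, which is the diversification effect the abstract attributes to timberland.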
Abstract:
Query reformulation is a key user behavior during Web search. Our research goal is to develop predictive models of query reformulation during Web searching. This article reports results from a study in which we automatically classified the query-reformulation patterns for 964,780 Web searching sessions, composed of 1,523,072 queries, to predict the next query reformulation. We employed an n-gram modeling approach to describe the probability of users transitioning from one query-reformulation state to another to predict their next state. We developed first-, second-, third-, and fourth-order models and evaluated each model for accuracy of prediction, coverage of the dataset, and complexity of the possible pattern set. The results show that Reformulation and Assistance account for approximately 45% of all query reformulations; furthermore, the results demonstrate that the first- and second-order models provide the best predictability, between 28 and 40% overall and higher than 70% for some patterns. Implications are that the n-gram approach can be used for improving searching systems and searching assistance.
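The n-gram (order-n Markov) approach described above can be sketched as transition counts from a history of reformulation states to the next state. The state labels and sessions below are hypothetical examples; the study derived its states automatically from web search logs.

```python
from collections import Counter, defaultdict

# Minimal sketch of an order-n model over query-reformulation states.
sessions = [
    ["New", "Reformulation", "Assistance", "Reformulation"],
    ["New", "Specialization", "Reformulation", "Content Change"],
    ["New", "Reformulation", "Reformulation", "Assistance"],
]

def train_ngram(sessions, order=2):
    """Count transitions from each length-`order` state history to the next state."""
    counts = defaultdict(Counter)
    for states in sessions:
        for i in range(len(states) - order):
            history = tuple(states[i:i + order])
            counts[history][states[i + order]] += 1
    return counts

def predict_next(counts, history):
    """Return the most likely next state for a history, or None if unseen."""
    dist = counts.get(tuple(history))
    return dist.most_common(1)[0][0] if dist else None

model = train_ngram(sessions, order=2)
print(predict_next(model, ["New", "Reformulation"]))
```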
Abstract:
This paper reports findings from a study investigating the effect of integrating sponsored and nonsponsored search engine links into a single web listing. The premise underlying this research is that web searchers are chiefly interested in relevant results. Given the reported negative bias that web searchers have concerning sponsored links, separate listings may be a disservice to web searchers as they might not direct searchers to relevant websites. Some web meta-search engines integrate sponsored and nonsponsored links into a single listing. Using a web search engine log of over 7 million interactions from hundreds of thousands of users of a major web meta-search engine, we analysed the click-through patterns for both sponsored and nonsponsored links. We also classified web queries as informational, navigational and transactional based on the expected type of content and analysed the click-through patterns of each classification. The findings show that for more than 35% of queries, there are no clicks on any result. More than 80% of web queries are informational in nature, approximately 10% are transactional, and 10% are navigational. Sponsored links account for approximately 15% of all clicks. Integrating sponsored and nonsponsored links does not appear to increase the clicks on sponsored listings. We discuss how these research results could enhance future sponsored search platforms.
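The two analysis steps described above can be sketched as a simple intent classifier plus a click tally over log records. The keyword heuristics and log entries below are illustrative placeholders, not the classification rules or data used in the study.

```python
# Hypothetical keyword heuristics for query intent; the study's actual
# classification scheme is not reproduced here.
TRANSACTIONAL = {"buy", "download", "price", "cheap", "order"}
NAVIGATIONAL = {".com", ".org", "www", "homepage", "login"}

def classify(query: str) -> str:
    terms = query.lower().split()
    if any(t in TRANSACTIONAL for t in terms):
        return "transactional"
    if any(any(marker in t for marker in NAVIGATIONAL) for t in terms):
        return "navigational"
    return "informational"

log = [  # (query, link type clicked or None)
    ("cheap flights brisbane", "sponsored"),
    ("www.qut.edu.au", "organic"),
    ("history of the brisbane river", "organic"),
    ("buy laptop", None),  # query with no click on any result
]

clicks = {}
for query, link in log:
    intent = classify(query)
    clicks.setdefault(intent, {"sponsored": 0, "organic": 0, "none": 0})
    clicks[intent][link or "none"] += 1
print(clicks)
```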
Abstract:
In this paper, we use time series analysis to evaluate predictive scenarios using search engine transactional logs. Our goal is to develop models for the analysis of searchers’ behaviors over time and investigate if time series analysis is a valid method for predicting relationships between searcher actions. Time series analysis is a method often used to understand the underlying characteristics of temporal data in order to make forecasts. In this study, we used a Web search engine transactional log and time series analysis to investigate users’ actions. We conducted our analysis in two phases. In the initial phase, we employed a basic analysis and found that 10% of searchers clicked on sponsored links. However, from 22:00 to 24:00, searchers almost exclusively clicked on the organic links, with almost no clicks on sponsored links. In the second and more extensive phase, we used a one-step prediction time series analysis method along with a transfer function method. The period rarely affects navigational and transactional queries, while rates for transactional queries vary during different periods. Our results show that the average length of a searcher session is approximately 2.9 interactions and that this average is consistent across time periods. Most importantly, our findings shows that searchers who submit the shortest queries (i.e., in number of terms) click on highest ranked results. We discuss implications, including predictive value, and future research.
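A minimal sketch of one-step-ahead forecasting on a search-log time series follows. The hourly sponsored-click rates are synthetic, and a simple ARIMA(1,0,1) model stands in for the paper's one-step prediction and transfer-function methods, which are not reproduced here.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic hourly sponsored-click rates with a daily cycle plus noise.
rng = np.random.default_rng(0)
hours = np.arange(72)
clicks = 0.10 + 0.03 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.01, 72)

# Fit a simple ARIMA model and produce a one-step-ahead forecast.
model = ARIMA(clicks, order=(1, 0, 1)).fit()
next_hour = model.forecast(steps=1)
print(f"predicted sponsored-click rate next hour: {next_hour[0]:.3f}")
```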
Abstract:
This paper reports results from a study in which we automatically classified the query reformulation patterns for 964,780 Web searching sessions (composed of 1,523,072 queries) in order to predict what the next query reformulation would be. We employed an n-gram modeling approach to describe the probability of searchers transitioning from one query reformulation state to another and to predict their next state. We developed first-, second-, third-, and fourth-order models and evaluated each model for accuracy of prediction. Findings show that Reformulation and Assistance account for approximately 45 percent of all query reformulations. Searchers seem to seek system searching assistance early in the session or after a content change. The results of our evaluations show that the first- and second-order models provided the best predictability, between 28 and 40 percent overall, and higher than 70 percent for some patterns. Implications are that the n-gram approach can be used for improving searching systems and searching assistance in real time.
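The model-order comparison described above can be sketched as a held-out evaluation of prediction accuracy and coverage for different n-gram orders. The sessions below are hypothetical state sequences, not data from the study, and the evaluation metric is a simplified stand-in for the paper's own measures.

```python
from collections import Counter, defaultdict

def evaluate(train, test, order):
    """Train an order-n transition model and score it on held-out sessions."""
    counts = defaultdict(Counter)
    for s in train:
        for i in range(len(s) - order):
            counts[tuple(s[i:i + order])][s[i + order]] += 1
    hits = misses = uncovered = 0
    for s in test:
        for i in range(len(s) - order):
            dist = counts.get(tuple(s[i:i + order]))
            if dist is None:
                uncovered += 1            # pattern never seen in training
            elif dist.most_common(1)[0][0] == s[i + order]:
                hits += 1
            else:
                misses += 1
    total = hits + misses + uncovered
    return hits / total, (hits + misses) / total   # accuracy, coverage

train = [["New", "Reformulation", "Assistance", "Reformulation"],
         ["New", "Reformulation", "Reformulation", "Content Change"]]
test = [["New", "Reformulation", "Assistance", "Content Change"]]
for order in (1, 2):
    acc, cov = evaluate(train, test, order)
    print(f"order {order}: accuracy {acc:.2f}, coverage {cov:.2f}")
```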
Abstract:
Anecdotal evidence highlights issues with alcohol and other drugs (AODs) and their association with safety risk on construction sites. Information is limited, however, regarding the prevalence of AODs in the workplace, and there is limited evidential guidance regarding how to address them effectively. This research aimed to scientifically evaluate the use of AODs within the Australian construction industry in order to reduce the potential resulting safety and performance impacts and engender a cultural change in the workforce. A national qualitative and quantitative evaluation of the use of AODs was conducted with approximately 500 employees. Results indicate that, as in the general population, a proportion of those sampled in the construction sector may be at risk of hazardous alcohol consumption, and they support the need for evidence-based, tailored responses. This is the first known study to scientifically evaluate the use of AODs and potential workplace safety impacts in the construction sector.
Abstract:
The purpose of this study was to identify the pedagogical knowledge relevant to the successful completion of a pie chart item. This purpose was achieved through the identification of the essential fluencies that 12–13-year-olds required for the successful solution of a pie chart item. Fluency relates to ease of solution and is particularly important in mathematics because it impacts on performance. Although the majority of students were successful on this multiple choice item, there was considerable divergence in the strategies they employed. Approximately two-thirds of the students employed efficient multiplicative strategies, which recognised and capitalised on the pie chart as a proportional representation. In contrast, the remaining one-third of students used a less efficient additive strategy that failed to capitalise on the representation of the pie chart. The results of our investigation of students’ performance on the pie chart item during individual interviews revealed that five distinct fluencies were involved in the solution process: conceptual (understanding the question), linguistic (keywords), retrieval (strategy selection), perceptual (orientation of a segment of the pie chart) and graphical (recognising the pie chart as a proportional representation). In addition, some students exhibited mild disfluencies corresponding to the five fluencies identified above. Three major outcomes emerged from the study. First, a model of knowledge of content and students for pie charts was developed. This model can be used to inform instruction about the pie chart and guide strategic support for students. Second, perceptual and graphical fluency were identified as two aspects of the curriculum, which should receive a greater emphasis in the primary years, due to their importance in interpreting pie charts. Finally, a working definition of fluency in mathematics was derived from students’ responses to the pie chart item.
Abstract:
The molecular and metal profile fingerprints were obtained from a complex substance, Atractylis chinensis DC, a traditional Chinese medicine (TCM), using the high performance liquid chromatography (HPLC) and inductively coupled plasma atomic emission spectroscopy (ICP-AES) techniques. This substance was used in this work as an example of a complex biological material which has found application as a TCM. Such TCM samples are traditionally processed by the Bran, Cut, Fried and Swill methods, and were collected from five provinces in China. The data matrices obtained from the two types of analysis produced two principal component biplots, which showed that the HPLC fingerprint data were discriminated on the basis of the methods for processing the raw TCM, while the metal analysis data grouped according to geographical origin. When the two data matrices were combined into one two-way matrix, the resulting biplot showed a clear separation on the basis of the HPLC fingerprints. Importantly, within each grouping the objects separated according to their geographical origin, and they ranked in approximately the same order in each group. This result suggested that, by using such an approach, it is possible to derive improved characterisation of complex TCM materials on the basis of the two kinds of analytical data. In addition, two supervised pattern recognition methods, the K-nearest neighbours (KNN) method and linear discriminant analysis (LDA), were successfully applied to the individual data matrices, thus supporting the PCA approach.
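The chemometric workflow described above can be sketched as: scale the two data blocks, join them into one two-way matrix, inspect the samples with PCA, and classify them with KNN and LDA. The data and class labels below are randomly generated placeholders standing in for the HPLC and ICP-AES measurements.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Placeholder data blocks standing in for the two analytical techniques.
rng = np.random.default_rng(1)
n_samples = 40
hplc = rng.normal(size=(n_samples, 12))      # stand-in HPLC peak areas
metals = rng.normal(size=(n_samples, 8))     # stand-in ICP-AES metal levels
labels = rng.choice(["Bran", "Cut", "Fried", "Swill"], size=n_samples)

# Combine the scaled blocks into one two-way matrix.
combined = np.hstack([StandardScaler().fit_transform(hplc),
                      StandardScaler().fit_transform(metals)])

# PCA scores give the sample coordinates that would appear in a biplot.
scores = PCA(n_components=2).fit_transform(combined)
print("first two PC scores:\n", scores[:3])

# Supervised pattern recognition on the combined matrix.
for name, clf in [("KNN", KNeighborsClassifier(n_neighbors=3)),
                  ("LDA", LinearDiscriminantAnalysis())]:
    clf.fit(combined, labels)
    print(name, "training accuracy:", clf.score(combined, labels))
```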
Abstract:
In December 2007, random roadside drug testing commenced in Queensland, Australia. Subsequently, the aim of this study was to explore the preliminary impact of Queensland’s drug driving legislation and enforcement techniques by applying Stafford and Warr’s [Stafford, M. C., & Warr, M. (1993). A reconceptualization of general and specific deterrence. Journal of Research in Crime and Delinquency, 30, 123-135] reconceptualization of deterrence theory. A comprehensive drug driving questionnaire was completed by 899 members of the public, university students, and individuals referred to a drug diversion program. Of note, approximately a fifth of participants reported drug driving in the past six months. Additionally, the analysis indicated that punishment avoidance and vicarious punishment avoidance were predictors of the propensity to drug drive in the future. In contrast, there were indications that knowing of others apprehended for drug driving was not a sufficient deterrent. Sustained testing and publicity of the legislation and countermeasure appear needed to increase the deterrent impact for drug driving.
Abstract:
The paper analyses the expected value of OD volumes from probes with fixed error, error that is proportional to zone size, and error that is inversely proportional to zone size. To add realism to the analysis, real trip ODs in the Tokyo Metropolitan Region are synthesised. The results show that for small zone coding with an average radius of 1.1 km and a fixed measurement error of 100 m, an accuracy of 70% can be expected. The equivalent accuracy for medium zone coding with an average radius of 5 km would translate into a fixed error of approximately 300 m. As expected, small zone coding is more sensitive than medium zone coding, as the chances of the probe error envelope falling into adjacent zones are higher. For the same error radii, error proportional to zone size would deliver a higher level of accuracy. As over half (54.8%) of trip ends start or end at zones with an equivalent radius of ≤1.2 km and only 13% of trip ends occur at zones with an equivalent radius of ≥2.5 km, measurement error that is proportional to zone size, such as that of mobile phones, would deliver a higher level of accuracy. The synthesis of real ODs with different probe error characteristics has shown that an expected value of >85% is difficult to achieve for small zone coding with an average radius of 1.1 km. For most transport applications, an OD matrix at medium zone coding is sufficient for transport management. From this study it can be concluded that GPS, with an error range between 2 and 5 m, at medium zone coding (average radius of 5 km) would provide OD estimates greater than 90% of the expected value. However, for a typical mobile phone operating error range at medium zone coding, the expected value would be lower than 85%. This paper assumes the transmission of one origin and one destination position from the probe. However, if multiple positions within the origin and destination zones were transmitted, map matching to the transport network could be performed, which would greatly improve the accuracy of the probe data.
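The idea behind the expected-value analysis can be illustrated with a small Monte Carlo sketch: how often does a probe-reported trip end, perturbed by measurement error, still fall inside its true zone? Zones are idealised here as circles and the error is drawn uniformly within the error radius; both are simplifying assumptions for illustration only, not the paper's synthesis of real Tokyo ODs.

```python
import numpy as np

rng = np.random.default_rng(42)

def expected_zone_accuracy(zone_radius_m, error_radius_m, trials=100_000):
    """Share of simulated trip ends whose measured position stays in the true zone."""
    # True position: uniform within the circular zone.
    r = zone_radius_m * np.sqrt(rng.random(trials))
    theta = 2 * np.pi * rng.random(trials)
    x, y = r * np.cos(theta), r * np.sin(theta)
    # Measurement error: uniform within the error radius.
    er = error_radius_m * np.sqrt(rng.random(trials))
    et = 2 * np.pi * rng.random(trials)
    xm, ym = x + er * np.cos(et), y + er * np.sin(et)
    return np.mean(np.hypot(xm, ym) <= zone_radius_m)

print("small zones (1.1 km), 100 m error:", expected_zone_accuracy(1100, 100))
print("medium zones (5 km), 300 m error:", expected_zone_accuracy(5000, 300))
```

Because the sketch ignores irregular zone shapes and the clustering of trip ends near zone boundaries, its accuracies are optimistic relative to the figures reported above; it only illustrates why smaller zones are more sensitive to a given error radius.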
Abstract:
Vehicle detectors have been installed at approximately 300-meter intervals on each lane of the Tokyo Metropolitan Expressway. Various traffic data, such as traffic volume, average speed and time occupancy, are collected by these vehicle detectors. We can understand the traffic characteristics of every point by comparing traffic data collected at consecutive points. In this study, we focused on average speed, analyzed road potential in terms of operating speed during free-flow conditions, and identified latent bottlenecks. Furthermore, we analyzed the effects of rainfall level and day of the week on road potential. It is expected that this method of analysis will be utilized for the installation of ITS such as driving assistance, the estimation of parameters for traffic simulation, and feedback to road design as a congestion countermeasure.
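The bottleneck-identification step can be sketched as follows: estimate each detector's road potential as its free-flow operating speed and flag points where that potential drops sharply relative to the neighbouring detector. The 5-minute records, the occupancy cut-off and the 10 km/h drop threshold below are illustrative assumptions, not parameters from the study.

```python
import pandas as pd

# Hypothetical detector records: location (km post), speed, and occupancy.
records = pd.DataFrame({
    "detector_km": [10.0, 10.3, 10.6, 10.9] * 3,
    "speed_kmh":   [95, 93, 78, 90, 97, 92, 75, 91, 96, 94, 80, 92],
    "occupancy":   [0.05, 0.06, 0.07, 0.05, 0.04, 0.05, 0.08, 0.06,
                    0.05, 0.05, 0.07, 0.05],
})

free_flow = (records[records["occupancy"] < 0.10]          # free-flow records only
             .groupby("detector_km")["speed_kmh"]
             .quantile(0.85)                                # operating speed per detector
             .sort_index())

drop = free_flow.diff()                                     # change vs. previous detector
latent_bottlenecks = drop[drop < -10].index.tolist()        # >10 km/h drop (assumed threshold)
print(free_flow)
print("latent bottleneck candidates at km:", latent_bottlenecks)
```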
Abstract:
When I arrived in Queensland's capital in 1996, Brisbane was commonly referred to as an 'overgrown country town'. This might have been an acceptable description in the 1990s, but it cannot be applied any longer. Brisbane, affectionately referred to by the locals as Bris-Vegas, has now come of age. Following Sydney and Melbourne, Brisbane is the third most populous city in Australia with a population of approximately two million. Interestingly, the 2006 Census showed that 22 per cent of Brisbane's population was born overseas, the three main countries of birth being the UK, New Zealand and South Africa. Brisbane City is centred on its most dominant environmental element, the Brisbane River, which effectively carves Brisbane into two areas - the Northside and the Southside. The 2001 addition of Cox Rayner's Goodwill Pedestrian and Cycle Bridge signified Brisbane's acceptance and affectionate embrace of its River, resulting in a long overdue linkage between Brisbane's North and South. It connects the City's key precincts - from the Northside CBD through Queensland University of Technology (QUT), across the Brisbane River, to the recreational precinct of the Southside Southbank Parklands. The Southside cultural precinct of Southbank is home to Queensland's Art Gallery, Performing Arts Complex, State Library and Museum - each of which was designed by Brisbane stalwart architect Robin Gibson in the 1970s and '80s. The past decade has borne witness to a city which has keenly supported emerging architects in addition to the more entrenched stalwarts of the profession, resulting in a youthful, relaxed and unpretentious sub-tropical city. Viva Bris-Vegas!
Abstract:
A set of five tasks was designed to examine dynamic aspects of visual attention: selective attention to color, selective attention to pattern, dividing and switching attention between color and pattern, and selective attention to pattern with changing target. These varieties of visual attention were examined using the same set of stimuli under different instruction sets; thus differences between tasks cannot be attributed to differences in the perceptual features of the stimuli. ERP data are presented for each of these tasks. A within-task analysis of different stimulus types varying in similarity to the attended target feature revealed that an early frontal selection positivity (FSP) was evident in selective attention tasks, regardless of whether color was the attended feature. The scalp distribution of a later posterior selection negativity (SN) was affected by whether the attended feature was color or pattern. The SN was largely unaffected by dividing attention across color and pattern. A large widespread positivity was evident in most conditions, consisting of at least three subcomponents which were differentially affected by the attention conditions. These findings are discussed in relation to prior research and the time course of visual attention processes in the brain.
Abstract:
Background: Pseudomonas aeruginosa is the most common bacterial pathogen in cystic fibrosis (CF) patients. Current infection control guidelines aim to prevent transmission via contact and respiratory droplet routes and do not consider the possibility of airborne transmission. We hypothesized that, with coughing, CF subjects produce viable, respirable bacterial aerosols. Methods: Cross-sectional study of 15 children and 13 adults with CF, 26 of whom were chronically infected with P. aeruginosa. A cough aerosol sampling system enabled fractionation of respiratory particles of different sizes and culture of viable Gram-negative non-fermentative bacteria. We collected cough aerosols during 5 minutes of voluntary coughing and during a sputum induction procedure when tolerated. Standardized quantitative culture and genotyping techniques were used. Results: P. aeruginosa was isolated in cough aerosols of 25 (89%) subjects, of whom 22 produced sputum samples. P. aeruginosa isolates from sputum and paired cough aerosols were indistinguishable by molecular typing. In 4 cases the same genotype was isolated from ambient room air. Approximately 70% of viable aerosols collected during voluntary coughing were of particles ≤3.3 microns aerodynamic diameter. P. aeruginosa, Burkholderia cenocepacia, Stenotrophomonas maltophilia and Achromobacter xylosoxidans were cultivated from respiratory particles in this size range. Positive room air samples were associated with high total counts in cough aerosols (P=0.003). The magnitude of cough aerosols was associated with higher FEV1 (r=0.45, P=0.02) and higher quantitative sputum culture results (r=0.58, P=0.008). Conclusion: During coughing, CF patients produce viable aerosols of P. aeruginosa and other Gram-negative bacteria in the respirable size range, suggesting the potential for airborne transmission.