236 results for Initial values


Relevance:

20.00%

Abstract:

As a key element in their response to new media forcing transformations in mass media and media use, newspapers have deployed various strategies not only to establish online and mobile products and develop healthy business plans, but to set out to be dominant portals. Their response to change was the subject of an early investigation by one of the present authors (Keshvani 2000). That was part of a set of short studies inquiring into what impact new software applications and digital convergence might have on journalism practice (Tickle and Keshvani 2000), and also looking for demonstrations of the way that innovations, technologies and protocols then under development might produce a “wireless, streamlined electronic news production process” (Tickle and Keshvani 2001). The newspaper study compared the online products of The Age in Melbourne and the Straits Times in Singapore. It provided an audit of the Singapore and Australia Information and Communications Technology (ICT) climate, concentrating on the state of development of carrier networks as a determining factor in the potential strength of the two services in their respective markets. In the outcome, contrary to initial expectations, the early cable roll-out and extensive ‘wiring’ of the city in Singapore had not produced a level of uptake of Internet services as strong as that achieved in Melbourne by more ad hoc and varied strategies. By interpretation, while news websites and online content were at an early stage of development everywhere, and much the same as one another, no determining structural imbalance existed to separate these leading media participants in Australia and South-east Asia. The present research revisits that situation by again studying the online editions of the two large newspapers in the original study, and one other, The Courier Mail (recognising the diversification of types of product in this field by including it as a representative of Newscorp, now a major participant). The inquiry works through the principle of comparison. It is an exercise in qualitative, empirical research that establishes a comparison between the situation in 2000, as described in the earlier work, and the situation in 2014, after a decade of intense development in digital technology affecting the media industries. It is in that sense a follow-up study on the earlier work, although this time giving emphasis to the content and style of the actual products as experienced by their users. It compares the online and print editions of each of these three newspapers; then the three mastheads as print and online entities, among themselves; and finally it compares one against the other two, as representing a South-east Asian model and Australian models. This exercise is accompanied by a review of literature on the developments in ICT affecting media production and media organisations, to establish the changed context. The new study of the online editions is conducted as a systematic appraisal of the first level, or principal screens, of the three publications, over the course of six days (10-15.2.14 inclusive). For this, categories for analysis were devised through a preliminary examination of the products over three days in the week before.
That process identified significant elements of media production, such as variegated sourcing of materials; randomness in the presentation of items; differential production values among the media platforms considered, whether text, video or still images; the occasional repurposing and repackaging of top news stories of the day; and the presence of standard news values, once again drawn out of the trial ‘bundle’ of journalistic items. Reduced in this way, the online artefacts become comparable with the companion print editions from the same days. The categories devised and then used in the appraisal of the online products have been adapted to print, to give the closest match of sets of variables. This device, studying the two sets of publications on like standards (essentially production values and news values), has enabled the comparisons to be made. This comparing of the online and print editions of each of the three publications was set up as the first step in the investigation. In recognition of the nature of the artefacts, as ones that carry very diverse information by subject and level of depth, and involve heavy creative investment in the formulation and presentation of the information, the assessment also includes an open section for interpreting and commenting on main points of comparison. This takes the form of a field for text, for the insertion of notes, in the table employed for summarising the features of each product, for each day. When the sets of comparisons as outlined above are noted, the process then becomes interpretative, guided by the notion of change. In the context of changing media technology and publication processes, what substantive alterations have taken place in the overall effort of news organisations in the print and online fields since 2001, and in their print and online products separately? Have they diverged or continued along similar lines? The remaining task is to begin to make inferences from that. Will the examination of findings support the proposition that a review of the earlier study, and a forensic review of new models, does provide evidence of the character and content of change, especially change in journalistic products and practice? Will it permit an authoritative description of the essentials of such change in products and practice? Will it permit generalisation, and provide a reliable base for discussion of the implications of change, and future prospects? Preliminary observations suggest a more dynamic and diversified product has been developed in Singapore, well themed, and obviously sustained by public commitment and habituation to diversified online and mobile media services. The Australian products suggest a concentrated corporate and journalistic effort and deployment of resources, with a strong market focus, but less settled and ordered, and showing signs of limitations imposed by the delay in establishing a uniform, large broadband network. The scope of the study is limited. It is intended to test, and take advantage of, the original study as evidentiary material from the early days of newspaper companies’ experimentation with online formats. Both are small studies. The key opportunity for discovery lies in the ‘time capsule’ factor: the availability of well-gathered and processed information on major newspaper company production, at the threshold of a transformational decade of change in their industry. The comparison stands to identify key changes.
It should also be useful as a reference for further inquiries of the same kind, and for monitoring the situation of newspaper portals online into the future.

Relevance:

20.00%

Abstract:

In the Bayesian framework a standard approach to model criticism is to compare some function of the observed data to a reference predictive distribution. The result of the comparison can be summarized in the form of a p-value, and it is well known that computation of some kinds of Bayesian predictive p-values can be challenging. The use of regression adjustment approximate Bayesian computation (ABC) methods is explored for this task. Two problems are considered. The first is the calibration of posterior predictive p-values so that they are uniformly distributed under some reference distribution for the data. Computation is difficult because the calibration process requires repeated approximation of the posterior for different data sets under the reference distribution. The second problem considered is approximation of distributions of prior predictive p-values for the purpose of choosing weakly informative priors in the case where the model checking statistic is expensive to compute. Here the computation is difficult because of the need to repeatedly sample from a prior predictive distribution for different values of a prior hyperparameter. In both these problems we argue that high accuracy in the computations is not required, which makes fast approximations such as regression adjustment ABC very useful. We illustrate our methods with several examples.
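
To make the approach concrete, here is a minimal sketch of the regression-adjustment step at the heart of such ABC methods, applied to a toy model; the model, variable names and acceptance rule are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of regression-adjustment ABC on a toy model; all names and
# the model itself are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

def simulate_summary(theta, n=50):
    # Toy model: the summary statistic is the sample mean of N(theta, 1) draws.
    return rng.normal(theta, 1.0, size=n).mean()

def abc_regression_adjust(s_obs, n_sims=10000, keep=500):
    theta = rng.normal(0.0, 10.0, size=n_sims)          # draws from a vague prior
    s = np.array([simulate_summary(t) for t in theta])  # simulated summaries
    idx = np.argsort(np.abs(s - s_obs))[:keep]          # keep closest simulations
    th, ss = theta[idx], s[idx]
    beta = np.polyfit(ss - s_obs, th, 1)[0]             # local linear regression slope
    return th - beta * (ss - s_obs)                     # adjusted posterior sample

posterior = abc_regression_adjust(s_obs=1.3)
print(posterior.mean(), posterior.std())
```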

Relevance:

20.00%

Abstract:

BACKGROUND: Postural instability is one of the major complications found in stroke survivors. Parameterising the functional reach test (FRT) could be useful in clinical practice and basic research. OBJECTIVES: To analyse the reliability, sensitivity, and specificity of FRT parameterisation using inertial sensors to record kinematic variables in patients who have suffered a stroke. DESIGN: Cross-sectional study. While performing the FRT, two inertial sensors were placed on the patient's back (lumbar and trunk). PARTICIPANTS: Five subjects over 65 who had suffered a stroke. MEASUREMENTS: FRT measure, lumbosacral/thoracic maximum angular displacement, time to maximum lumbosacral/thoracic angular displacement, time to return to the initial position, and total time. Speed and acceleration of the movements were calculated indirectly. RESULTS: The FRT measure was 12.75±2.06 cm. Intrasubject reliability values range from 0.829 (time to return to the initial position, lumbar sensor) to 0.891 (lumbosacral maximum angular displacement). Intersubject reliability values range from 0.821 (time to return to the initial position, lumbar sensor) to 0.883 (lumbosacral maximum angular displacement). The FRT's reliability was 0.987 (0.983-0.992) intersubject and 0.983 (0.979-0.989) intrasubject. CONCLUSION: Inertial sensors are a tool with excellent reliability and validity for parameterisation of the FRT in people who have had a stroke.
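
As a rough illustration of how speed and acceleration can be calculated indirectly from the angular displacements and times listed above, consider the sketch below; the sensor readings are invented and the constant-acceleration assumption is ours, not the study's.

```python
# Hypothetical sketch of the indirect speed/acceleration calculation from
# inertial-sensor outputs; all input values are made up for illustration.
import numpy as np

theta_max = np.deg2rad(35.0)   # lumbosacral maximum angular displacement (rad)
t_max = 1.8                    # time to maximum angular displacement (s)
t_return = 2.1                 # time to return to the initial position (s)

mean_velocity_out = theta_max / t_max           # rad/s on the way out
mean_velocity_back = theta_max / t_return       # rad/s on the way back
mean_acceleration = mean_velocity_out / t_max   # rad/s^2, assuming start from rest

print(mean_velocity_out, mean_velocity_back, mean_acceleration)
```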

Relevance:

20.00%

Abstract:

BACKGROUND This paper describes the first national burden of disease study for South Africa. The main focus is the burden due to premature mortality, i.e. years of life lost (YLLs). In addition, estimates of the burden contributed by morbidity, i.e. the years lived with disability (YLDs), are obtained to calculate disability-adjusted life years (DALYs); and the impact of AIDS on premature mortality in the year 2010 is assessed. METHOD Owing to the rapid mortality transition and the lack of timely data, a modelling approach has been adopted. The total mortality for the year 2000 is estimated using a demographic and AIDS model. The non-AIDS cause-of-death profile is estimated using three sources of data: Statistics South Africa, the National Department of Home Affairs, and the National Injury Mortality Surveillance System. A ratio method is used to estimate the YLDs from the YLL estimates. RESULTS The top single cause of mortality burden was HIV/AIDS, followed by homicide, tuberculosis, road traffic accidents and diarrhoea. HIV/AIDS accounted for 38% of total YLLs, which is proportionately higher for females (47%) than for males (33%). Pre-transitional diseases, usually associated with poverty and underdevelopment, accounted for 25%, non-communicable diseases 21% and injuries 16% of YLLs. The DALY estimates highlight the fact that mortality alone underestimates the burden of disease, especially with regard to unintentional injuries, respiratory disease, and nervous system, mental and sense organ disorders. The impact of HIV/AIDS is expected to more than double the burden of premature mortality by the year 2010. CONCLUSION This study has drawn together data from a range of sources to develop coherent estimates of premature mortality by cause. South Africa is experiencing a quadruple burden of disease comprising the pre-transitional diseases, the emerging chronic diseases, injuries, and HIV/AIDS. Unless interventions that reduce morbidity and delay mortality become widely available, the burden due to HIV/AIDS can be expected to grow very rapidly in the next few years. An improved base of information is needed to assess the morbidity impact more accurately.
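
The arithmetic behind these measures is straightforward; the sketch below shows the DALY composition (DALY = YLL + YLD) with the ratio method for YLDs, using invented numbers rather than the study's estimates.

```python
# Illustrative sketch of the DALY arithmetic described above; all numbers
# are hypothetical, not results from the South African study.

def yll(deaths, life_expectancy_at_death):
    # Years of life lost: deaths multiplied by residual life expectancy.
    return deaths * life_expectancy_at_death

def yld_from_ratio(yll_value, yld_to_yll_ratio):
    # Ratio method: scale YLLs by a cause-specific YLD/YLL ratio.
    return yll_value * yld_to_yll_ratio

deaths, expectancy, ratio = 10_000, 30.0, 0.25   # hypothetical inputs
y = yll(deaths, expectancy)
daly = y + yld_from_ratio(y, ratio)              # DALY = YLL + YLD
print(f"YLL={y:,.0f}, DALY={daly:,.0f}")
```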

Relevance:

20.00%

Abstract:

There is increased accountability for initial teacher education (ITE) programs in Australia to develop Graduate teachers who are better prepared. Most ITE programs have been designed using Pedagogical Content Knowledge. Informed by the growing Technological Pedagogical Content Knowledge (TPACK) research, this article suggests that ITE programs need to develop Graduate teachers who have the TPACK capabilities to use technologies to support teaching and student learning. Insights are provided from the research and evaluation of the Teaching Teachers for the Future (TTF) Project, which was guided by the TPACK conceptualisation and involved all higher education institutions providing ITE programs in Australia. The TTF Project research and evaluation included the development and administration of a TTF TPACK Survey and the implementation of the Most Significant Change methodology. Key findings from these methodologies are summarised to provide guidance for improving ITE programs to develop Graduate TPACK capabilities.

Relevance:

20.00%

Abstract:

The signal-to-noise ratio achievable in x-ray computed tomography (CT) images of polymer gels can be increased by averaging over multiple scans of each sample. However, repeated scanning delivers a small additional dose to the gel, which may compromise the accuracy of the dose measurement. In this study, a NIPAM-based polymer gel was irradiated and then CT scanned 25 times, with the resulting data used to derive an averaged image and a "zero-scan" image of the gel. Comparison between these two results and the first scan of the gel showed that the averaged and zero-scan images provided better contrast, higher contrast-to-noise and higher signal-to-noise than the initial scan. The pixel values (Hounsfield units, HU) in the averaged image were not noticeably elevated compared to the zero-scan result, and the gradients used in the linear extrapolation of the zero-scan images were small and symmetrically distributed around zero. These results indicate that the averaged image was not artificially lightened by the small additional dose delivered during CT scanning. This work demonstrates the broader usefulness of the zero-scan method as a means to verify the dosimetric accuracy of gel images derived from averaged x-ray CT data.
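
A minimal sketch of the zero-scan extrapolation as described, assuming a per-pixel linear fit of Hounsfield units against scan number whose intercept gives the "scan zero" image; array names and shapes are assumptions, not the authors' code.

```python
# Sketch of zero-scan extrapolation: fit a straight line to each pixel's
# value across sequential CT scans and keep the intercept (scan 0).
import numpy as np

def zero_scan_image(scans):
    """scans: array of shape (n_scans, H, W) of Hounsfield units."""
    n = scans.shape[0]
    x = np.arange(1, n + 1)                    # scan index 1..n
    flat = scans.reshape(n, -1)                # (n, H*W)
    slope, intercept = np.polyfit(x, flat, 1)  # per-pixel linear fit
    return intercept.reshape(scans.shape[1:]), slope.reshape(scans.shape[1:])

scans = np.random.normal(40.0, 2.0, size=(25, 64, 64))  # stand-in data
img0, gradients = zero_scan_image(scans)
averaged = scans.mean(axis=0)
# Compare averaged and extrapolated images, and check gradients are near zero.
print(np.abs(averaged - img0).mean(), gradients.mean())
```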

Relevance:

20.00%

Abstract:

BACKGROUND Measurement of the global burden of disease with disability-adjusted life-years (DALYs) requires disability weights that quantify health losses for all non-fatal consequences of disease and injury. There has been extensive debate about a range of conceptual and methodological issues concerning the definition and measurement of these weights. Our primary objective was a comprehensive re-estimation of disability weights for the Global Burden of Disease Study 2010 through a large-scale empirical investigation in which judgments about health losses associated with many causes of disease and injury were elicited from the general public in diverse communities through a new, standardised approach. METHODS We surveyed respondents in two ways: household surveys of adults aged 18 years or older (face-to-face interviews in Bangladesh, Indonesia, Peru, and Tanzania; telephone interviews in the USA) between Oct 28, 2009, and June 23, 2010; and an open-access web-based survey between July 26, 2010, and May 16, 2011. The surveys used paired comparison questions, in which respondents considered two hypothetical individuals with different, randomly selected health states and indicated which person they regarded as healthier. The web survey added questions about population health equivalence, which compared the overall health benefits of different life-saving or disease-prevention programmes. We analysed paired comparison responses with probit regression analysis on all 220 unique states in the study. We used results from the population health equivalence responses to anchor the results from the paired comparisons on the disability weight scale from 0 (implying no loss of health) to 1 (implying a health loss equivalent to death). Additionally, we compared new disability weights with those used in WHO's most recent update of the Global Burden of Disease Study for 2004. FINDINGS 13,902 individuals participated in household surveys and 16,328 in the web survey. Analysis of paired comparison responses indicated a high degree of consistency across surveys: correlations between individual survey results and results from analysis of the pooled dataset were 0·9 or higher in all surveys except in Bangladesh (r=0·75). Most of the 220 disability weights were located on the mild end of the severity scale, with 58 (26%) having weights below 0·05. Five (11%) states had weights below 0·01, such as mild anaemia, mild hearing or vision loss, and secondary infertility. The health states with the highest disability weights were acute schizophrenia (0·76) and severe multiple sclerosis (0·71). We identified a broad pattern of agreement between the old and new weights (r=0·70), particularly in the moderate-to-severe range. However, in the mild range below 0·2, many states had significantly lower weights in our study than previously. INTERPRETATION This study represents the most extensive empirical effort as yet to measure disability weights. By contrast with the popular hypothesis that disability assessments vary widely across samples with different cultural environments, we have reported strong evidence of highly consistent results.
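
For readers unfamiliar with probit analysis of paired-comparison data, the sketch below fits a simple Thurstonian probit model to simulated responses; it illustrates the general technique only, not the study's analysis code, and all data and names are invented.

```python
# Illustrative probit analysis of paired comparisons on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_states, n_pairs = 10, 5000
true_severity = np.sort(rng.uniform(0, 1, n_states))

# Each row: a respondent compares states i and j; outcome 1 if i judged less healthy.
i = rng.integers(0, n_states, n_pairs)
j = rng.integers(0, n_states, n_pairs)
keep = i != j
i, j = i[keep], j[keep]
y = (true_severity[i] - true_severity[j] + rng.normal(0, 0.3, i.size) > 0).astype(int)

# Design matrix: +1 for state i, -1 for state j (state 0 dropped as reference).
X = np.zeros((i.size, n_states - 1))
for row, (a, b) in enumerate(zip(i, j)):
    if a > 0:
        X[row, a - 1] += 1.0
    if b > 0:
        X[row, b - 1] -= 1.0

fit = sm.Probit(y, X).fit(disp=0)
print(fit.params)  # latent severities relative to the reference state
```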

Relevance:

20.00%

Abstract:

BACKGROUND Engineering is a problem-based, practically oriented discipline, whose practitioners aim to find solutions to engineering challenges that are effective both technically and economically. Engineering educators operate within a mandate to ensure that graduate engineers understand the practicalities and realities of good engineering practice. While this is a vital goal for the discipline, emerging influences are challenging the focus on ‘hard practicalities’ and requiring recognition of the cultural and social aspects of engineering. Expecting graduate engineers to possess the communication skills essential for negotiating satisfactory outcomes in contexts of complex social beliefs about the impact of their work can be an unsettling and challenging prospect for engineering educators. This project identifies and addresses Indigenous engineering practices and principles, and their relevance to future engineering practices. PURPOSE This Office of Learning and Teaching (OLT) project proposes that what is known or discoverable about Indigenous engineering knowledge and practices must be integrated into engineering curricula. This is an important aspect of ensuring that engineering as a profession responds competently to increasing demands for socially and environmentally responsible activity across all aspects of engineering activity. DESIGN/METHOD The project addresses: i) means for appropriate inclusion of Indigenous students in usual teaching activities; ii) assuring engineering educators have access to knowledge of Indigenous practices and skills relevant to particular engineering courses and topics; and iii) means for preparing all students to negotiate their way through issues of Indigenous relationships with the land where engineering projects are planned. The project is undertaking wide-ranging research to collate knowledge about Indigenous engineering principles and practices and to develop relevant resource materials. RESULTS It is common to hear that social issues such as ‘Indigenous concerns’ are only of concern to environmental engineers. We challenge that perspective, and make the case that Indigenous knowledge is an important issue for all engineering educators, in relation both to effective integration of Indigenous students and to preparation of all engineering graduates to engage with Indigenous communities. At the time of first contact, a rich, varied and technically literate Indigenous social framework possessed knowledge of the environment that is not yet fully acknowledged in Australian society. A core outcome of the work will be the development of resources relating to Indigenous engineering practices for inclusion in engineering core curricula. CONCLUSIONS A large body of technical knowledge was needed to survive and sustain human society in the complex environment that was Australia before 1788. This project is developing resource materials, and supporting documentation, about that knowledge to enable engineering educators to more easily integrate it into current curricula. The project also aims to demonstrate the importance for graduating engineers of appreciating the existence of diverse perspectives on engineering tasks and learning how to value, and employ, multiple paths to possible solutions.

Relevance:

20.00%

Abstract:

Non-use values (i.e. economic values assigned by individuals to ecosystem goods and services unrelated to current or future uses) provide one of the most compelling incentives for the preservation of ecosystems and biodiversity. Assessing the non-use values of non-users is relatively straightforward using stated preference methods, but the standard approaches for estimating the non-use values of users (stated decomposition) have substantial shortcomings which undermine the robustness of their results. In this paper, we propose a pragmatic interpretation of non-use values to derive estimates that capture their main dimensions, based on the identification of a willingness to pay for ecosystem protection beyond one's expected life. We empirically test our approach using a choice experiment on coral reef ecosystem protection in two coastal areas of New Caledonia with different institutional, cultural, environmental and socio-economic contexts. We compute individual willingness to pay estimates, and derive individual non-use value estimates using our interpretation. We find that, at a minimum, estimates of non-use values may comprise between 25% and 40% of the mean willingness to pay for ecosystem preservation, less than has been found in most studies.
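
By way of illustration, marginal willingness to pay in a choice experiment is conventionally derived from conditional logit coefficients as the negative ratio of an attribute coefficient to the cost coefficient; the sketch below uses invented coefficients, not the paper's estimates.

```python
# Hedged sketch: WTP from a conditional logit choice model; the coefficient
# values are hypothetical, for illustration only.

# Utility: U = b_protection * protection_level + b_cost * cost + ...
b_protection = 0.042   # hypothetical coefficient on reef protection level
b_cost = -0.0015       # hypothetical (negative) coefficient on annual cost

# Marginal WTP is the rate at which money trades off against the attribute.
wtp_protection = -b_protection / b_cost
print(f"WTP per unit of protection: {wtp_protection:.0f} currency units/year")
```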

Relevance:

20.00%

Abstract:

The Office of Urban Management recognises that the values which characterise the SEQ region as 'subtropical' are important determinants of form in urban and regional planning. Subtropical values are those qualities on which our regional identity depends. A built environment which responds positively to these values is a critical ingredient for achieving a desirable future for the region. The Centre for Subtropical Design has undertaken this study to identify the particular set of values which characterises SEQ, and to translate these values into design principles that will maintain and reinforce the value set. The principles not only apply to the overall balance between the natural environment and the built environment, but can also be applied by local government authorities to guide local planning schemes and help shape specific built form outcomes.

Relevance:

20.00%

Abstract:

Purpose - The purpose of this paper is to determine the impact stigma has on property values and how long the stigma remains after the Not in My Back Yard (NIMBY) structure has been removed. Design/methodology/approach - A quantitative analysis was undertaken, using a high voltage overhead transmission line (HVOTL) case study, to determine the effect on property values before and after removal of the NIMBY structure. A repeat sales index, in conjunction with regression analysis, determined the length of time the stigma remained after removal of the NIMBY structure. Findings - The results show that while the NIMBY is in place, the impact on value is confined to those properties in close proximity. This contrasts with the findings on removal of the NIMBY, where the property values of the whole neighbourhood improve, with the stigma remaining for 3 to 4 years. Research implications - The implication of this research is that property valuers need to change the way they take into account the presence of NIMBYs when valuing property, with more emphasis being placed on the neighbourhood rather than just the properties in close proximity. While the HVOTL was in place, only properties in close proximity were negatively affected, but on removal of the HVOTL the whole neighbourhood increased in value. Originality/value - The results expand on current knowledge by demonstrating the length of time the market takes to adjust to the removal of a NIMBY structure.
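
For context, a repeat sales index of the kind used here is conventionally estimated by regressing log price ratios of paired sales on period dummies (the Bailey-Muth-Nourse construction); the sketch below uses invented sales data, not the case-study data.

```python
# Sketch of a Bailey-Muth-Nourse repeat-sales index on invented data.
import numpy as np

# Each repeat sale: (first_period, second_period, log_price_ratio).
sales = [(0, 2, 0.08), (1, 3, 0.05), (0, 3, 0.12), (2, 4, 0.03), (1, 4, 0.09)]
n_periods = 5

# Design matrix: -1 in the first-sale period, +1 in the second-sale period,
# with period 0 as the base (index fixed at 1.0 there).
X = np.zeros((len(sales), n_periods - 1))
y = np.zeros(len(sales))
for row, (t0, t1, dlogp) in enumerate(sales):
    if t0 > 0:
        X[row, t0 - 1] = -1.0
    X[row, t1 - 1] = 1.0
    y[row] = dlogp

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
index = np.exp(np.concatenate(([0.0], beta)))   # index level per period
print(index)
```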

Relevance:

20.00%

Abstract:

This study evaluated the complexity of calcium ion exchange with a sodium exchanged weak acid cation resin (DOW MAC-3). Exchange equilibria recorded for a range of solution normalities revealed profiles represented by conventional “L” or “H” type isotherms at low values of the equilibrium concentration (Ce) of calcium ions, plus a superimposed region of increasing calcium uptake at high Ce values. The loading of calcium ions was determined to be ca. 53.5 to 58.7 g/kg of resin when modelling only the sorption curve created at low Ce values, which exhibited a well-defined plateau. The calculated calcium ion loading capacity for DOW MAC-3 resin appeared to correlate with the manufacturer's recommendation. The phenomenon of super equivalent ion exchange (SEIX) was observed when the “driving force” for the exchange process was increased beyond 2.25 mmol of calcium ions per gram of resin in the starting solution. This latter event was explained in terms of displacement of sodium ions from sodium hydroxide solution which remained in the resin bead following the initial conversion of the as-supplied “H+” exchanged resin sites to the “Na+” version required for softening studies. Evidence for hydrolysis of a small fraction of the sites on the sodium exchanged resin surface was noted. The importance of carefully choosing experimental parameters was discussed, especially in relation to application of the Langmuir–Vageler expression. This latter model, which compares the ratio of the initial calcium ion concentration in solution to resin mass against the final equilibrium loading of calcium ions on the resin, was found to be an excellent means of identifying the progress of the calcium–sodium ion exchange process. Moreover, the Langmuir–Vageler model facilitated standardization of the various calcium–sodium ion exchange experiments, which allowed systematic experimental design.
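
As an illustration of the isotherm modelling mentioned above, the sketch below fits a Langmuir isotherm to the low-Ce portion of a sorption curve; the data points and starting values are invented, not the paper's measurements.

```python
# Illustrative Langmuir isotherm fit, q = q_max*K*Ce/(1 + K*Ce), on the
# low-Ce region; all data are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, q_max, k):
    return q_max * k * ce / (1.0 + k * ce)

ce = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])      # mmol/L, hypothetical
q = np.array([14.0, 24.0, 37.0, 47.0, 53.0, 56.0])  # g/kg resin, hypothetical

(q_max, k), _ = curve_fit(langmuir, ce, q, p0=(55.0, 0.5))
print(f"q_max = {q_max:.1f} g/kg, K = {k:.2f} L/mmol")
```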

Relevance:

20.00%

Abstract:

Constructed wetlands are among the most common Water Sensitive Urban Design (WSUD) measures for stormwater treatment. These systems have been extensively studied to understand their performance and influential treatment processes. Unfortunately, most past studies have treated a wetland as a lumped system, with a primary focus on the reduction of event mean concentration (EMC) values of specific pollutant species or on total pollutant load removal. This research study adopted an innovative approach by partitioning the inflow runoff hydrograph and then investigating treatment performance in each partition and its relationships with a range of hydraulic factors. The study outcomes confirmed that, influenced by rainfall characteristics, the constructed wetland displays different treatment characteristics for the initial and later sectors of the runoff hydrograph. The treatment of small rainfall events (<15 mm) is comparatively better at the beginning of runoff events, while the pollutant load reductions for large rainfall events (>15 mm) are generally lower at the beginning and gradually increase towards the end of rainfall events. This highlights the importance of ensuring that the inflow into a constructed wetland has low turbulence in order to achieve consistent treatment performance for both small and large rainfall events.
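
To make the EMC notion concrete, the sketch below computes a flow-weighted event mean concentration for a whole event and for initial and later hydrograph partitions; the flow and concentration series are invented, not the study's monitoring data.

```python
# Sketch of event mean concentration (EMC) calculation on invented data:
# EMC is the flow-weighted mean concentration over a runoff event.
import numpy as np

t = np.arange(0, 120, 10)                                           # minutes
flow = np.array([2, 8, 15, 20, 18, 14, 10, 7, 5, 3, 2, 1], float)   # L/s
conc = np.array([90, 70, 55, 40, 35, 30, 28, 25, 24, 22, 20, 20], float)  # mg/L

def emc(flow, conc):
    # EMC = total pollutant load / total runoff volume (uniform time steps).
    return np.sum(conc * flow) / np.sum(flow)

half = len(t) // 2
print(f"whole event EMC:    {emc(flow, conc):.1f} mg/L")
print(f"initial sector EMC: {emc(flow[:half], conc[:half]):.1f} mg/L")
print(f"later sector EMC:   {emc(flow[half:], conc[half:]):.1f} mg/L")
```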

Relevance:

20.00%

Abstract:

In 2012, the Australian Council of Deans of Education (ACDE), through the Queensland University of Technology, led a MATSITI project focusing on issues related to the retention, support and graduation of Aboriginal and Torres Strait Islander teachers in initial Teacher Education programs across Australia. While some of the barriers that impact on the graduation of Aboriginal and Torres Strait Islander teachers are well known, this was the first large-scale Australian study to look at the issues nationally and in depth. Thirty-four Teacher Education programs across the country were audited, meetings were held in each state, both Aboriginal and Torres Strait Islander and non-Indigenous faculty were consulted, and approximately 70 Aboriginal and Torres Strait Islander pre-service teachers were interviewed. This paper reports on the outcomes of that project, including the evidence that while recruitment into Teacher Education has, in some sites, reached parity, retention rates are well below expected levels across the nation. The paper focuses both on the quantitative data and, even more significantly, on the voices of the pre-service teachers themselves, offering insights into the ways forward. As a result of this study, Deans and Heads of School of Teacher Education programs across the country have developed Action Plans alongside their universities' Indigenous Higher Education Centres to improve the support and retention of Aboriginal and Torres Strait Islander teachers.

Relevance:

20.00%

Abstract:

We propose a new way to build a combined list from K base lists, each containing N items. A combined list consists of top segments of various sizes from each base list so that the total size of all top segments equals N. A sequence of item requests is processed and the goal is to minimize the total number of misses. That is, we seek to build a combined list that contains all the frequently requested items. We first consider the special case of disjoint base lists. There, we design an efficient algorithm that computes the best combined list for a given sequence of requests. In addition, we develop a randomized online algorithm whose expected number of misses is close to that of the best combined list chosen in hindsight. We prove lower bounds that show that the expected number of misses of our randomized algorithm is close to the optimum. In the presence of duplicate items, we show that computing the best combined list is NP-hard. We show that our algorithms still apply to a linearized notion of loss in this case. We expect that this new way of aggregating lists will find many ranking applications.
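
A plain dynamic-programming sketch of the hindsight problem for disjoint base lists is given below: choose one top-segment size per list, with the sizes summing to N, so as to maximise covered requests (equivalently, minimise misses). This is an illustrative formulation under our assumptions; the paper's own algorithm may be more efficient.

```python
# Sketch: best combined list in hindsight for disjoint base lists, via DP.
from collections import Counter

def best_combined_list(base_lists, requests, N):
    counts = Counter(requests)
    # gains[k][s] = number of requests covered by the top-s segment of list k.
    gains = []
    for lst in base_lists:
        g, total = [0], 0
        for item in lst:
            total += counts[item]
            g.append(total)
        gains.append(g)
    # dp[s] = best coverage using s total slots over the lists processed so far.
    dp = [0] + [float("-inf")] * N
    for g in gains:
        new = [float("-inf")] * (N + 1)
        for s in range(N + 1):
            if dp[s] == float("-inf"):
                continue
            for take in range(min(len(g) - 1, N - s) + 1):
                new[s + take] = max(new[s + take], dp[s] + g[take])
        dp = new
    return len(requests) - dp[N]   # minimum number of misses in hindsight

lists = [["a", "b", "c"], ["d", "e", "f"], ["g", "h", "i"]]
reqs = ["a", "d", "d", "g", "a", "e", "g", "g", "b"]
print(best_combined_list(lists, reqs, N=3))  # -> 2 misses for the best list
```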