867 results for Paper and cardboard
Abstract:
Objectives: To validate the WOMAC 3.1 in a touch screen computer format, which presents each question as a cartoon in writing and in speech (QUALITOUCH method), and to assess patient acceptance of the computer touch screen version. Methods: The paper and computer formats of WOMAC 3.1 were applied in random order to 53 subjects with hip or knee osteoarthritis. The mean age of the subjects was 64 years (range 45 to 83); 60% were male, 53% were 65 years or older, and 53% used computers at home or at work. Agreement between formats was assessed by intraclass correlation coefficients (ICCs). Preferences were assessed with a supplementary questionnaire. Results: ICCs between formats were 0.92 (95% confidence interval, 0.87 to 0.96) for pain, 0.94 (0.90 to 0.97) for stiffness, and 0.96 (0.94 to 0.98) for function. ICCs were similar in men and women, in subjects with or without previous computer experience, and in subjects below or above age 65. The computer format was found easier to use by 26% of the subjects and the paper format by 8%, while 66% were undecided. Overall, 53% of subjects preferred the computer format, 9% preferred the paper format, and 38% were undecided. Conclusion: The computer format of the WOMAC 3.1 is a reliable assessment tool. Agreement between computer and paper formats was independent of computer experience, age, or sex. Thus the computer format may help improve patient follow-up by meeting patients' preferences and providing immediate results.
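The format-agreement statistic used above, the intraclass correlation coefficient, can be sketched in a few lines. This is an illustrative ICC(2,1) implementation (two-way random effects, absolute agreement, single measure; a common formulation, assumed here, with made-up scores, not the study's data):

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measure. `scores` is an (n subjects) x (k formats) array."""
    n, k = scores.shape
    grand = scores.mean()
    ss_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum()   # subjects
    ss_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum()   # formats
    ss_err = ((scores - grand) ** 2).sum() - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# identical scores in both formats -> perfect agreement
both = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
print(icc_2_1(both))   # → 1.0
```

A systematic offset between formats (e.g. one format scoring one point higher) lowers ICC(2,1), because absolute agreement, not just consistency, is assessed.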
Abstract:
To study the media messages portrayed to children, 925 students aged 9 to 14 years completed “The Sociocultural Influences Questionnaire.” The media section is the focus of this paper, and the responses to three questions were selected to examine the media's influence to become slimmer, to increase weight, or to increase muscle size. While girls and boys exhibited different levels of agreement with each media influence, both genders disagreed that media messages were implying they should gain weight. This is consistent with the belief that the media perpetuate the ideal of thinness and that a negative stigma is associated with being overweight.
Abstract:
The wide-line H-1 nuclear magnetic resonance (NMR) spectrum of paper in equilibrium with ambient humidity consists of superimposed, relatively broad and narrow lines. The narrower line is of the order of 2 kHz wide at half the maximum height, while the broader line is of the order of 40 kHz wide at half height. On the basis of these line widths, the narrow line is assigned to water sorbed to the paper, and the broad line to the polymeric constituents of the paper. It was not possible to distinguish between the various polymeric components of paper contributing to the H-1 NMR spectra. A modified Goldman-Shen pulse sequence was used to generate a spatial magnetisation gradient between the polymer and water phases. The exchange of magnetisation between protons associated with water and those associated with the macromolecules in paper was observed. This exchange is discussed within a heat-transfer model for homonuclear dipolar coupling, with exchange characterised by a spin-diffusion coefficient. Consideration of the magnitude of the initial rate of the exchange process, together with estimates of the spin-spin relaxation times based on H-1 line widths, indicates that some water must exist in a state sufficiently immobile to allow homonuclear dipolar interactions between adjacent polymer and water protons. Thus, water sorbed onto paper must exist in at least two states in mass exchange with each other. This observation allows certain conclusions to be drawn about the ratio of free to bound water as a function of moisture content and about the dispersal of water within the polymer matrix.
Abstract:
Due to complex field/tissue interactions, high-field magnetic resonance (MR) images suffer significant distortions that compromise diagnostic quality. A new method that attempts to remove these distortions is proposed in this paper, based on the use of transceive phased arrays. The proposed system uses, in the examples presented herein, a shielded four-element transceive phased-array head coil and involves performing two separate scans of the same slice, each scan using a different excitation during transmission. By optimizing the amplitudes and phases for each scan, antipodal signal profiles can be obtained, and by combining the two images, the image distortion can be reduced severalfold. A combined hybrid method of moments (MoM)/finite element method (FEM) and finite-difference time-domain (FDTD) technique is proposed and used to elucidate the concept of the new method and to accurately evaluate the electromagnetic field (EMF) in a human head model. In addition, the proposed method is used in conjunction with the generalized auto-calibrating partially parallel acquisitions (GRAPPA) reconstruction technique to enable rapid imaging of the two scans. Simulation results reported herein for 11-T (470-MHz) brain imaging applications show that the new method with GRAPPA reconstruction theoretically results in improved image quality and that the proposed combined hybrid MoM/FEM and FDTD technique is suitable for high-field magnetic resonance imaging (MRI) numerical analysis.
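The two-scan idea, complementary excitation profiles whose combination evens out the intensity distortion, can be illustrated with a toy sketch. This is not the authors' reconstruction; it simply combines two hypothetical 1-D profiles by root-sum-of-squares, one common way of merging multi-acquisition data:

```python
import numpy as np

def combine_two_scans(img_a, img_b):
    """Root-sum-of-squares combination of two acquisitions whose signal
    profiles are complementary (each bright where the other is dark)."""
    return np.sqrt(np.abs(img_a) ** 2 + np.abs(img_b) ** 2)

# toy 1-D profiles: each scan alone is strongly non-uniform
x = np.linspace(0.0, np.pi, 5)
scan_a = np.sin(x)     # bright in the centre, dark at the edges
scan_b = np.cos(x)     # bright at the edges, dark in the centre
flat = combine_two_scans(scan_a, scan_b)   # uniform profile
```

Because sin² + cos² = 1, the combined toy profile is perfectly flat; real antipodal profiles only approximate this, which is why the paper optimizes the transmit amplitudes and phases.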
Abstract:
This rejoinder reflects an important step, for me, in a preoccupation with methodology that has provided me with many hours of enjoyable reading, not to mention anxiety. For me the ‘reality’ of the incommensurable nature of paradigms and acceptance of the legitimacy of a range of conceptual and philosophical traditions came late. As a constructionist I find myself on the ‘anything goes’ end of methodology choice. This paper and my main paper ought not to be read as a critique of ‘middle range’ theory, but as a critique of an important and necessary aspect of the way we all seek to inscribe facts and structure our writing. What follows is a reflection of the influence Bruno Latour’s writings have had on my ways of seeing and perhaps an unhealthy emphasis on the small things that combine to produce convincing arguments and ‘facts’.
Abstract:
This dissertation documents the everyday lives and spaces of a population of youth typically constructed as out of place, and the broader urban context in which they are rendered as such. Thirty-three female and transgender street youth participated in the development of this youth-based participatory action research (YPAR) project utilizing geo-ethnographic methods, auto-photography, and archival research throughout a six-phase, eighteen-month research process in Bogotá, Colombia. This dissertation details the participatory writing process that enabled the YPAR research team to destabilize dominant representations of both street girls and urban space, and the participatory mapping process that enabled the development of a youth vision of the city through cartographic images. The maps display individual and aggregate spatial data indicating trends within, and making comparisons between, three subgroups of the research population according to nine spatial variables. These spatial data, coupled with photographic and ethnographic data, substantiate that street girls’ mobilities and activity spaces intersect with and are altered by state-sponsored urban renewal projects and paramilitary-led social cleansing killings, both efforts to clean up Bogotá by purging the city center of deviant populations and places. Advancing an ethical approach to conducting research with excluded populations, this dissertation argues for the enactment of critical field praxis and care ethics within a YPAR framework to incorporate young people as principal research actors rather than merely voices represented in adultist academic discourse. The interjection of considerations of space, gender, and participation into the study of street youth produces new ways of envisioning the city and the role of young people in research.
Instead of seeing the city from a panoptic view, Bogotá is revealed through the eyes of street youth who participated in the construction and feminist visualization of a new cartography and counter-map of the city grounded in embodied, situated praxis. This dissertation presents a socially responsible approach to conducting action-research with high-risk youth by documenting how street girls reclaim their right to the city on paper and in practice: through maps of their everyday exclusion in Bogotá followed by activism to fight against it.
Abstract:
Financial innovations have emerged globally to close the gap between the rising global demand for infrastructures and the availability of financing sources offered by traditional financing mechanisms such as fuel taxation, tax-exempt bonds, and federal and state funds. The key to sustainable innovative financing mechanisms is effective policymaking. This paper discusses the theoretical framework of a research study whose objective is to structurally and systemically assess financial innovations in global infrastructures. The research aims to create analysis frameworks, taxonomies and constructs, and simulation models pertaining to the dynamics of the innovation process to be used in policy analysis. Structural assessment of innovative financing focuses on the typologies and loci of innovations and evaluates the performance of different types of innovative financing mechanisms. Systemic analysis of innovative financing explores the determinants of the innovation process using the System of Innovation approach. The final deliverables of the research include propositions pertaining to the constituents of System of Innovation for infrastructure finance which include the players, institutions, activities, and networks. These static constructs are used to develop a hybrid Agent-Based/System Dynamics simulation model to derive propositions regarding the emergent dynamics of the system. The initial outcomes of the research study are presented in this paper and include: (a) an archetype for mapping innovative financing mechanisms, (b) a System of Systems-based analysis framework to identify the dimensions of Systems of Innovation analyses, and (c) initial observations regarding the players, institutions, activities, and networks of the System of Innovation in the context of the U.S. transportation infrastructure financing.
Abstract:
The purpose of this study was to evaluate the impact of the ropes course on the self-esteem of undergraduate and graduate students. The ropes course provides challenging experiential activities that facilitate personal confidence and group teamwork. The study relates to adult education, experiential education, and program evaluation within the context of hospitality and tourism management education. Quantitative data were based on the assessment of self-esteem through the completion of the Coopersmith Self-Esteem Inventory (SEI-C) modified for this study. Qualitative data were based on assessment of participants' reflective papers stating their perceptions of the ropes course. The study compared a treatment group (31 undergraduate and 25 graduate students) which participated in the ropes course and a control group (31 undergraduate and 25 graduate students) which did not. Both groups completed the pre- and post-treatment SEI-C at the same time intervals. The quantitative data were analyzed using a two-way repeated-measures analysis of variance (ANOVA). The qualitative data, comprising reflective papers voluntarily written by 44 (79%) of the treatment group, were coded using four major themes reflecting the students' perceptions of the ropes course. Scores on the pretest and posttest of the SEI-C were not significantly different for the two groups. The qualitative data, however, showed a favorable impact of the ropes course. The discrepancy between the outcomes of these two measures suggests that the SEI-C self-report paper-and-pencil instrument may not be sufficiently refined to evaluate the complex issue of self-esteem. The SEI-C, if used, should be supplemented by other evaluation measures, since the two measures may be evaluating different components of self-esteem. They may also be differentially affected by bias in scoring, or by the statistical characteristics of reliability and validity.
Abstract:
Léon Walras (1874) had already realized that his neo-classical general equilibrium model could not accommodate autonomous investment. Sen analysed the same issue in a simple, one-sector macroeconomic model of a closed economy. He showed that fixing investment in a model built strictly on neo-classical assumptions would make the system overdetermined; thus, one should loosen some neo-classical condition of competitive equilibrium. He analysed three non-neo-classical “closure options” which could make the model well determined in the case of fixed investment. Others later extended his list and showed that the closure dilemma arises in the more complex computable general equilibrium (CGE) models as well, as does the choice of the adjustment mechanism assumed to bring about equilibrium at the macro level. By means of numerical models, it was also illustrated that the adopted closure rule can significantly affect the results of policy simulations based on a CGE model. Despite these warnings, the issue of macro closure is often neglected in policy simulations. It is, therefore, worth revisiting the issue, demonstrating its importance by further examples, and pointing out that the closure problem in CGE models extends well beyond the question of how to incorporate autonomous investment into a CGE model. Several closure rules are discussed in this paper, and their diverse outcomes are illustrated by numerical models calibrated on statistical data. First, the analysis is done in a one-sector model, similar to Sen’s, but extended into a model of an open economy. Next, the same analyses are repeated using a fully-fledged multisectoral CGE model calibrated on the same statistical data.
Comparing the results obtained by the two models shows that, although under the same closure option they generate quite similar results in terms of the direction and, to a somewhat lesser extent, the magnitude of change in the main macro variables, the predictions of the multi-sectoral CGE model are clearly more realistic and balanced.
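Sen's one-sector closure dilemma can be made concrete with a toy calculation (illustrative numbers only): with Y = C + I and C = cY, fixing both investment and full-employment output overdetermines the system, so one of the two must be freed.

```python
# Sen-style one-sector model: Y = C + I, C = c*Y, so savings S = (1 - c)*Y.
c = 0.8                     # hypothetical propensity to consume

# Neo-classical closure: output fixed at full employment; investment adjusts.
Y_full = 100.0
I_neoclassical = (1 - c) * Y_full     # investment must equal savings: 20.0

# Keynesian closure: investment fixed (autonomous); output adjusts instead.
I_fixed = 15.0
Y_keynesian = I_fixed / (1 - c)       # multiplier determines output: 75.0

# Fixing I = 15 *and* Y = 100 simultaneously is impossible, since
# 15 != (1 - c) * 100 -- this is the overdetermination Sen identified.
```

Different closures thus deliver different macro outcomes from identical behavioural equations, which is why the choice of closure rule matters in CGE policy simulations.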
Abstract:
Acknowledgements: Although this paper is not linked to any of the research carried out by, or on behalf of, the James Hutton Institute, some parts of it were written during the first author’s allocated work time. Rachel Creaney is grateful to the James Hutton Institute for giving her this opportunity. The authors would also like to thank Dr Tavis Potts (University of Aberdeen) for proofreading the first draft of the paper and providing valuable comments on its flow, structure and contents. Finally, the authors are grateful to Emily Hastings and Doug Wardell-Johnson from the James Hutton Institute for their assistance in obtaining some of the data used in this paper.
Abstract:
Acknowledgments: The authors would like to thank Shell E&P Rijswijk for supporting this research. The authors are grateful to Pat Shannon, Catherine Baudon and Dominique Frizon de Lamotte for many discussions on rift processes. We would also like to thank Steven Bergman for thorough comments on an early version of the paper, and Chris Morley and an anonymous reviewer for sharing ideas and references that helped us write a better paper.
Abstract:
The dissertation consists of three chapters related to the low-price guarantee marketing strategy and to energy efficiency analysis. The low-price guarantee is a marketing strategy in which firms promise to charge consumers the lowest price among their competitors. Chapter 1 addresses the research question "Does a Low-Price Guarantee Induce Lower Prices?" by looking into the retail gasoline industry in Quebec, where a major branded firm started a low-price guarantee in 1996. Chapter 2 conducts a consumer welfare analysis of low-price guarantees to derive policy implications and offers a new explanation of firms' incentives to adopt a low-price guarantee. Chapter 3 develops energy performance indicators (EPIs) to measure the energy efficiency of manufacturing plants in the pulp, paper and paperboard industry.
Chapter 1 revisits the traditional view that a low-price guarantee results in higher prices by facilitating collusion. Using accurate market definitions and station-level data from the retail gasoline industry in Quebec, I conduct a descriptive analysis based on stations and price zones to compare price and sales movements before and after the guarantee was adopted. I find that, contrary to the traditional view, the stores that offered the guarantee significantly decreased their prices and increased their sales. I also build a difference-in-differences model and quantify the decrease in the posted price of the stores that offered the guarantee at 0.7 cents per liter. While this change is significant, I do not find a significant response in competitors' prices. The sales of the stores that offered the guarantee increased significantly while the competitors' sales decreased significantly; however, the significance vanishes if I use station-clustered standard errors. Comparing my observations with the predictions of different theories of low-price guarantees, I conclude that the empirical evidence supports the view that the low-price guarantee is a simple commitment device and induces lower prices.
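The difference-in-differences estimate described above boils down to comparing the price change at guarantee stations with the contemporaneous change at comparison stations. A minimal sketch with made-up prices (not the study's data; the actual model also includes controls and clustered standard errors):

```python
def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Change at guarantee stations minus the contemporaneous change at
    comparison stations; the control change nets out market-wide moves."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# hypothetical mean posted prices in cents per litre (not the study's data)
effect = diff_in_diff(treat_pre=62.0, treat_post=61.1,
                      ctrl_pre=61.5, ctrl_post=61.3)   # ≈ -0.7 c/L
```

A negative estimate of this form is what the chapter reports: guarantee stations cut prices by about 0.7 cents per liter more than their competitors over the same period.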
Chapter 2 conducts a consumer welfare analysis of low-price guarantees to address antitrust concerns and potential government regulation, and explains firms' potential incentives to adopt a low-price guarantee. Using station-level data from the retail gasoline industry in Quebec, I estimate consumers' demand for gasoline with a structural model of spatial competition that incorporates the low-price guarantee as a commitment device, which allows firms to pre-commit to charging the lowest price among their competitors. Counterfactual analysis under a Bertrand competition setting shows that the stores that offered the guarantee attracted many more consumers and decreased their posted price by 0.6 cents per liter. Although the matching stores suffered a decrease in profits from gasoline sales, they are incentivized to adopt the low-price guarantee to attract more consumers to the store, likely increasing profits at attached convenience stores. Firms have strong incentives to adopt a low-price guarantee on the product about which their consumers are most price-sensitive, while earning a profit from the products not covered by the guarantee. I estimate that consumers earn about 0.3% more surplus when the low-price guarantee is in place, which suggests that the authorities need not be concerned about, or regulate, low-price guarantees. In Appendix B, I also propose an empirical model to examine how low-price guarantees would change consumer search behavior and whether consumer search plays an important role in estimating consumer surplus accurately.
Chapter 3, joint with Gale Boyd, describes work with the pulp, paper, and paperboard (PP&PB) industry to provide a plant-level indicator of energy efficiency for facilities that produce various types of paper products in the United States. Organizations that implement strategic energy management programs undertake a set of activities that, if carried out properly, have the potential to deliver sustained energy savings. Energy performance benchmarking is a key activity of strategic energy management and one way to enable companies to set energy efficiency targets for manufacturing facilities. The opportunity to assess plant energy performance through comparison with similar plants in the same industry is a highly desirable and strategic method of benchmarking for industrial energy managers. However, access to energy performance data for conducting industry benchmarking is usually unavailable to most industrial energy managers. The U.S. Environmental Protection Agency (EPA), through its ENERGY STAR program, seeks to overcome this barrier through the development of manufacturing sector-based plant energy performance indicators (EPIs) that encourage U.S. industries to use energy more efficiently. In developing the energy performance indicator tools, consideration is given to the role that performance-based indicators play in motivating change; the steps necessary for indicator development, from interacting with an industry to securing adequate data for the indicator; and the actual application and use of an indicator once complete. How indicators are employed in EPA's efforts to encourage industries to voluntarily improve their use of energy is discussed as well. The chapter describes the data and statistical methods used to construct the EPI for plants within selected segments of the pulp, paper, and paperboard industry: specifically, pulp mills and integrated paper & paperboard mills.
The individual equations are presented, as are the instructions for using those equations as implemented in an associated Microsoft Excel-based spreadsheet tool.
Abstract:
Spectral albedo has been measured at Dome C since December 2012 in the visible and near infrared (400-1050 nm) at sub-hourly resolution using a home-made spectral radiometer. Superficial specific surface area (SSA) has been estimated by fitting the observed albedo spectra to the analytical Asymptotic Approximation Radiative Transfer theory (AART). The dataset includes fully-calibrated albedo and SSA that pass the quality checks described in the companion article. Only data for solar zenith angles less than 75° have been included, which theoretically spans the period October-March. In addition, to correct for residual errors still affecting the data after calibration, especially at solar zenith angles higher than 60°, we produced a higher-quality albedo time series as follows: in the SSA estimation process described in the companion paper, a scaling coefficient A between the observed albedo and the theoretical model predictions was introduced to cope with these errors. This coefficient thus provides a first-order estimate of the residual error. By dividing the albedo by this coefficient, we produced the "scaled fully-calibrated albedo". We strongly recommend using the latter for most applications because it generally remains in the physical range 0-1. The former albedo is provided for reference to the companion paper and because it does not depend on the SSA estimation process and its underlying assumptions.
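The scaling correction described above is a simple element-wise division of each spectrum by its fitted coefficient. A minimal sketch with a made-up spectrum (A is a hypothetical value, not one from the dataset):

```python
import numpy as np

def scale_albedo(albedo, A):
    """Divide a fully-calibrated albedo spectrum by its fitted scaling
    coefficient A to obtain the 'scaled fully-calibrated albedo'."""
    return np.asarray(albedo, dtype=float) / A

# toy spectrum whose residual calibration error pushes it above 1
raw = np.array([1.02, 0.99, 0.96])
A = 1.02                  # hypothetical coefficient from the SSA fit
scaled = scale_albedo(raw, A)   # back inside the physical range 0-1
```

Because A absorbs the residual calibration error, the scaled spectrum generally falls back within the physical 0-1 range, which is why the scaled product is recommended for most applications.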
Abstract:
This project on Policy Solutions and International Perspectives on the Funding of Public Service Media Content for Children began on 8 February 2016 and concludes on 31 May 2016. Its outcomes contribute to the policy-making process around BBC Charter Review, which has raised concerns about the financial sustainability of UK-produced children’s screen content. The aim of this project is to evaluate different funding possibilities for public service children’s content in a more challenging and competitive multiplatform media environment, drawing on experiences outside the UK. The project addresses the following questions:
• What forms of alternative funding exist to support public service content for children in a transforming multiplatform media environment?
• What can we learn from the types of funding and support for children’s screen content that are available elsewhere in the world, in terms of regulatory foundations, administration, accountability, levels of funding, and the amounts and types of content supported?
• How effective are these funding systems and funding sources for supporting domestically produced content (range and numbers of projects supported; audience reach)?
This stakeholder report constitutes the main outcome of the project and provides an overview and analysis of alternatives for supporting and funding home-grown children’s screen content across both traditional broadcasting outlets and emerging digital platforms. The report has been made publicly available so that it can inform policy work and responses to the UK Government White Paper, A BBC for the Future, published by the Department for Culture, Media and Sport in May 2016.
Abstract:
Background: The impact of cancer upon children, teenagers and young people can be profound. Research has been undertaken to explore the impacts upon children, teenagers and young people with cancer, but little is known about how researchers can ‘best’ engage with this group to explore their experiences. This review paper provides an overview of the utility of data collection methods employed when undertaking research with children, teenagers and young people. A systematic review of relevant databases was undertaken utilising the search terms ‘young people’, ‘young adult’, ‘adolescent’ and ‘data collection methods’. The full text of the papers deemed eligible from the title and abstract was accessed and, following discussion within the research team, thirty papers were included. Findings: Owing to the heterogeneity in the scope of the papers identified, the following data collection methods were included in the results section. Three of the papers identified provided an overview of data collection methods utilised with this population, and the remaining twenty-seven papers covered the following data collection methods: digital technologies; arts-based research; comparisons of ‘paper and pencil’ research with web-based technologies; the use of games; the use of a specific communication tool; questionnaires and interviews; and focus groups and telephone interviews/questionnaires. The strengths and limitations of these data collection methods are discussed, drawing upon such issues as the appropriateness of particular methods for particular age groups, or the most appropriate method to employ when exploring a particularly sensitive topic area. Conclusions: There are a number of data collection methods utilised to undertake research with children, teenagers and young adults. This review provides a summary of the currently available evidence and an overview of the strengths and limitations of the data collection methods employed.