705 results for Microsoft Excel (computer program)
Abstract:
Camera traps have become a widely used technique for conducting biological inventories, generating large numbers of database records of great interest. The main aim of this paper is to describe a new free and open-source software (FOSS) application developed to facilitate the management of camera-trap data originating from a protected Mediterranean area (SE Spain). In the last decade, some other useful alternatives have been proposed, but ours focuses especially on collaborative work and on the importance of the spatial information underpinning common camera-trap studies. This FOSS application, named "Camera Trap Manager" (CTM), has been designed to expedite the processing of pictures on the .NET platform. CTM provides a very intuitive user interface, automatic extraction of image metadata (date, time, moon phase, location, temperature, atmospheric pressure, among others), analytical capabilities (Geographical Information Systems, statistics, charts, among others), and reporting capabilities (ESRI Shapefiles, Microsoft Excel spreadsheets, PDF reports, among others). Using this application, we have achieved much simpler data management, faster analysis, and a significant reduction in costs. While we were able to classify an average of 55 pictures per hour manually, CTM has made it possible to process over 1,000 photographs per hour, consequently retrieving a greater amount of data.
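Much of the speed-up described above comes from reading metadata straight from each image file rather than transcribing it by hand. CTM itself is a .NET application; the short Python sketch below only illustrates the kind of EXIF extraction (capture date/time and camera model) that such a tool automates, using the Pillow library and a hypothetical file name.

```python
# Minimal illustration of automated image-metadata extraction of the kind CTM performs.
# This is not CTM's .NET code; it is a Python sketch based on the Pillow library.
from PIL import Image
from PIL.ExifTags import TAGS

def read_capture_metadata(path):
    """Return basic EXIF fields (capture date/time, camera model) for one photo."""
    exif = Image.open(path).getexif()
    decoded = {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}
    return {
        "datetime": decoded.get("DateTime"),  # capture date and time as recorded by the camera
        "camera": decoded.get("Model"),       # camera model string
    }

print(read_capture_metadata("IMG_0001.JPG"))  # hypothetical camera-trap picture
```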
Abstract:
The project for this thesis arises from an agreement between the Terminology Laboratory of the School of Languages and Literatures, Translation and Interpreting of Forlì and the Food and Agriculture Organization of the United Nations (FAO) for the preparation of degree theses in the field of terminology, in collaboration with FAO's Meeting Programming and Documentation Service (CPAM). This work carries out terminological research in English and Russian in the field of animal genetic resources and, more specifically, of biotechnologies and animal health, and creates a bilingual English-Russian terminological database that can also be used in computer-assisted translation programs. The first chapter presents an overview of FAO, its history, its missions and its objectives. The second chapter deals with FAO's Animal Production and Health Division, which is responsible for animal genetic resources and for animal biotechnologies and health, the domain and subdomain of this thesis project. The third chapter concerns languages for special purposes, their characteristics, their relationship with general language, and the text typology of the corpora used for the terminological research. The fourth chapter discusses terminology and terminological work, illustrating the approaches and schools of modern terminology and briefly describing the peculiarities of biotechnology terminology. Finally, the fifth chapter presents the entire terminological research project, from the preparatory phase to the revision phase, focusing in particular on the procedure for compiling and converting the terminological records to create the bilingual database. The appendix at the end of the thesis contains the concept maps, the terminological records from the Microsoft Excel file, and the bilingual English-Russian terminological database.
Abstract:
The objective of this study was to compare the in vitro dissolution profile of a new rapidly absorbed paracetamol tablet containing sodium bicarbonate (PS) with that of a conventional paracetamol tablet (P), and to relate these by deconvolution and mapping to in vivo release. The dissolution methods used included the standard procedure described in the USP monograph for paracetamol tablets, employing buffer at pH 5.8, or 0.05 M HCl at stirrer speeds between 10 and 50 rpm. The mapping process was developed and implemented in Microsoft Excel® worksheets that iteratively calculated the optimal values of the scale and shape factors linking in vivo time to in vitro time. The in vitro-in vivo correlation (IVIVC) was carried out simultaneously for both formulations to produce common mapping factors. The USP method, using buffer at pH 5.8, demonstrated no difference between the two products. However, using an acidic medium, the rate of dissolution of P, but not of PS, decreased with decreasing stirrer speed. A significant correlation (r = 0.773; p < 0.00001) was established between in vivo release and in vitro dissolution using the profiles obtained with 0.05 M HCl and a stirrer speed of 30 rpm. The scale factor for optimal simultaneous IVIVC in the fasting state was 2.54 and the shape factor was 0.16; the corresponding values for mapping in the fed state were 3.37 and 0.13 (implying a larger in vitro-in vivo time difference but a smaller shape difference in the fed state). The current IVIVC explains, in part, the observed in vivo variability of the two products. The approach to mapping may also be extended to different batches of these products, to predict the impact of any changes in in vitro dissolution on in vivo release and plasma drug concentration-time profiles.
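The abstract does not give the exact functional form of the time mapping, so the sketch below assumes a simple power-law relation, t_vitro = scale * t_vivo**shape, fitted by least squares. It is only meant to illustrate the kind of iterative scale/shape optimisation the authors implemented in Excel worksheets; the paired time points are invented, not study data.

```python
# Hedged sketch of a scale/shape time mapping between in vivo and in vitro time.
# Functional form and data are assumptions made for illustration only.
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical paired times (h) at which the same fraction is released in vivo
# and dissolved in vitro (e.g. read off a Levy-type plot).
t_vivo = np.array([0.25, 0.5, 1.0, 2.0, 4.0])
t_vitro = np.array([0.6, 1.1, 2.3, 4.8, 9.5])

def time_map(t, scale, shape):
    return scale * t**shape

(scale, shape), _ = curve_fit(time_map, t_vivo, t_vitro, p0=(1.0, 1.0))
print(f"scale = {scale:.2f}, shape = {shape:.2f}")
```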
Abstract:
Background: Poor diet is thought to be a risk factor for many diseases, including age-related macular disease (ARMD), which is the leading cause of blind registration among those aged over 60 years in the developed world. The aims of this study were 1) to evaluate the dietary food intake of three subject groups: participants under the age of 50 years without ARMD (U50), participants over the age of 50 years without ARMD (O50), and participants with ARMD (ARMD), and 2) to obtain information on nutritional supplement usage. Methods: A prospective cross-sectional study designed in a clinical practice setting. Seventy-four participants were divided into three groups: U50, 20 participants aged under 50 years (range 21-40; mean ± SD, 37.7 ± 10.1 years); O50, 27 participants aged over 50 years (range 52-77; 62.7 ± 6.8 years); and ARMD, 27 participants aged over 50 years with ARMD (range 55-79; 66.0 ± 5.8 years). Participants were issued with a three-day food diary and were also asked to provide details of any daily nutritional supplements. The diaries were analysed using FoodBase 2000 software. Data were input by one investigator and statistically analysed using Microsoft Excel for Microsoft Windows XP, employing unpaired t-tests. Results: Group O50 consumed significantly more vitamin C (t = 3.049, p = 0.005) and significantly more fibre (t = 2.107, p = 0.041) than group U50. Group ARMD consumed significantly more protein (t = 3.487, p = 0.001) and zinc (t = 2.252, p = 0.029) than group O50. The ARMD group consumed the highest percentage of specific ocular health supplements, and the U50 group consumed the most multivitamins. Conclusions: We did not detect a deficiency of any specific nutrient in the diets of those with ARMD compared with age- and gender-matched controls. ARMD patients may be aware of research into the use of nutritional supplementation to prevent progression of their condition.
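The comparisons above rest on independent-samples (unpaired) t-tests run in Excel. The sketch below shows the equivalent test in Python; the intake values are placeholders, not study data.

```python
# Sketch of the unpaired (independent-samples) t-test used to compare nutrient
# intakes between groups. The numbers are hypothetical daily vitamin C intakes (mg).
from scipy import stats

vitamin_c_u50 = [45.0, 62.3, 51.8, 70.2, 58.9]
vitamin_c_o50 = [80.1, 95.4, 72.6, 88.0, 102.3]

t_stat, p_value = stats.ttest_ind(vitamin_c_u50, vitamin_c_o50)  # unpaired t-test
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```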
Abstract:
The purpose of this paper is to explain the notion of clustering and a concrete clustering method: the agglomerative hierarchical clustering algorithm. It shows how a data mining method such as clustering can be applied to the analysis of stocks traded on the Bulgarian Stock Exchange in order to identify similar temporal behaviour among the traded stocks. This problem is solved with the aid of a data mining tool, the XLMiner™ add-in for Microsoft Excel.
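As a minimal illustration of the technique named above, the sketch below runs agglomerative hierarchical clustering on log returns of a few price series using SciPy. The tickers and prices are invented, not Bulgarian Stock Exchange data, and the linkage/distance choices are only one reasonable configuration.

```python
# Agglomerative hierarchical clustering of stock price series (illustrative data).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

prices = np.array([
    [10.0, 10.5, 11.0, 10.8],   # stock A
    [20.0, 21.1, 22.0, 21.6],   # stock B (similar temporal behaviour to A)
    [5.0,  4.7,  4.4,  4.6],    # stock C (different behaviour)
])
returns = np.diff(np.log(prices), axis=1)                    # log returns capture temporal behaviour

Z = linkage(returns, method="average", metric="euclidean")   # build the agglomerative hierarchy
labels = fcluster(Z, t=2, criterion="maxclust")              # cut the dendrogram into 2 clusters
print(labels)
```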
Abstract:
The article's basic question is how planning can be used to increase the efficiency of production processes and, with them, of the company's operation as a whole. Among the levels and tools of production planning, we concentrate on medium-term aggregate planning. The main reason is that, in our experience, the practical application of this planning level is not yet widespread in Hungary; consequently, a more thorough knowledge of the tool and its wider adoption could reveal significant reserves for improving operational efficiency. The paper first introduces the structure of production planning and its different but interconnected levels, then elaborates aggregate planning in more detail and applies the classic aggregate planning model to the case of a Hungarian manufacturing company. In the analysis we examine the applicability of the model and the effect of different planning scenarios on efficiency. The computational solution of the mathematical model was carried out with Microsoft Excel Solver.
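For readers unfamiliar with the kind of model solved in Excel Solver, the sketch below sets up a toy aggregate-planning linear program (production and inventory decisions under an inventory-balance constraint). The demands and costs are illustrative; the case-study model has more periods, resources and constraints.

```python
# Toy aggregate production planning LP, analogous in structure to the Solver model.
from scipy.optimize import linprog

demand = [100, 150, 120]            # units demanded in periods 1..3 (hypothetical)
prod_cost, hold_cost = 10.0, 2.0    # cost per unit produced / held one period

# Decision variables: production P1..P3 followed by end-of-period inventory I1..I3.
c = [prod_cost] * 3 + [hold_cost] * 3

# Inventory balance: P_t + I_{t-1} - I_t = demand_t, with I_0 = 0.
A_eq = [
    [1, 0, 0, -1,  0,  0],
    [0, 1, 0,  1, -1,  0],
    [0, 0, 1,  0,  1, -1],
]
b_eq = demand

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6, method="highs")
print(res.x, res.fun)  # optimal production/inventory plan and its total cost
```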
Abstract:
Raster-graphic ampelometric software was developed not only for the estimation of leaf area, but also for the characterization of grapevine (Vitis vinifera L.) leaves. The software was written in the C++ programming language, using C++ Builder 2007, for the Windows 95-XP and Linux operating systems. It handles desktop-scanned images. On the image analysed with the GRA.LE.D., the user has to determine 11 points. These points are then connected and the distances between them calculated. The GRA.LE.D. software supports standard ampelometric measurements such as leaf area, angles between the veins, and lengths of the veins. These measurements are recorded by the software and exported into plain ASCII text files for single or multiple samples. Twenty-two biometric data points for each leaf are identified by the GRA.LE.D. It presents the opportunity to statistically analyse experimental data, allows comparison of cultivars and enables graphic reconstruction of leaves using the Microsoft Excel Chart Wizard. The GRA.LE.D. was thoroughly calibrated and compared with other widely used instruments and methods such as photo-gravimetry, LiCor L0100, WinDIAS 2.0 and ImageTool. By comparison, the GRA.LE.D. produced the most accurate measurements of leaf area, but the LiCor L0100 and the WinDIAS 2.0 were faster, while the photo-gravimetric method proved to be the most time-consuming. The WinDIAS 2.0 instrument was the least reliable. The GRA.LE.D. is uncomplicated, user-friendly, accurate, consistent, reliable and has wide practical application.
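One way to obtain an area from a set of digitised outline points, as described above, is the shoelace formula. Whether GRA.LE.D. uses exactly this computation is an assumption; the Python sketch below simply shows the principle on arbitrary pixel coordinates.

```python
# Shoelace formula: area of a simple polygon given its ordered (x, y) vertices.
# Illustrative only; the coordinates below are arbitrary pixel positions.
def polygon_area(points):
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

leaf_outline = [(0, 0), (4, 1), (6, 5), (3, 8), (-1, 5)]   # ordered outline points (pixels)
print(polygon_area(leaf_outline))                          # area in pixels²; convert using the scan resolution
```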
Abstract:
The heavy fraction of crude oil can be used for numerous purposes, e.g. to obtain lubricating oils. In this context, many researchers have been studying alternatives for separating crude oil components, among which molecular distillation can be mentioned. Molecular distillation is a forced-evaporation technique that differs from other conventional processes in the literature. It can be classified as a special case of distillation under high vacuum, at pressures that reach extremely low values of the order of 0.1 Pa. The evaporation and condensation surfaces must be separated by a distance of the order of magnitude of the mean free path of the evaporated molecules, so that evaporated molecules easily reach the condenser along an unobstructed route, which is desirable. Thus, the main contribution of this work is the simulation of falling-film molecular distillation for crude oil mixtures. The crude oil was characterized using UniSim® Design R430 and Aspen HYSYS® V8.5. The results of this characterization were used in Microsoft® Excel® spreadsheets to calculate the physicochemical (thermodynamic and transport) properties of the residue of an oil sample. Based on these estimated properties and on boundary conditions suggested in the literature, the temperature and concentration profile equations were solved by the implicit finite-difference method using the Visual Basic® for Applications (VBA) programming language for Excel®. The resulting temperature profile was consistent with that reported in the literature, with a slight deviation in the initial values because the oil studied is lighter than that of the literature. The concentration profiles showed that the concentration of the more volatile components decreases, and that of the less volatile components increases, along the length of the evaporator. Consistent with the transport phenomena present in the process, the velocity profile tends to increase to a peak and then decrease, and the film thickness decreases, both as functions of the evaporator length. It is concluded that the simulation code in Visual Basic® (VBA) is a final product of this work and can be applied to the molecular distillation of petroleum and other similar mixtures.
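The authors' VBA code solves the full falling-film model; the Python sketch below only illustrates the numerical technique they name, an implicit (backward-Euler) finite-difference scheme, on a generic one-dimensional diffusion-type equation dT/dt = alpha * d²T/dx² with fixed boundary temperatures. All parameter values are placeholders.

```python
# Implicit (backward-Euler) finite-difference scheme for a generic 1D diffusion equation.
# Illustrates the method only; not the authors' falling-film molecular distillation model.
import numpy as np

nx, nt = 21, 50
dx, dt, alpha = 1.0 / (nx - 1), 0.01, 1.0
r = alpha * dt / dx**2

T = np.zeros(nx)
T[0], T[-1] = 1.0, 0.0                        # fixed (Dirichlet) boundary values

# Tridiagonal system (assembled densely for clarity): -r*T[i-1] + (1+2r)*T[i] - r*T[i+1] = T_old[i]
A = np.zeros((nx, nx))
A[0, 0] = A[-1, -1] = 1.0                     # boundary rows keep their prescribed values
for i in range(1, nx - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = -r, 1 + 2 * r, -r

for _ in range(nt):
    b = T.copy()
    T = np.linalg.solve(A, b)                 # one implicit time step

print(T.round(3))                             # dimensionless temperature profile along x
```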
Abstract:
Underwater georeferenced photo-transect surveys were conducted on December 10-15, 2011 at various sections of the reef at Lizard Island, Great Barrier Reef. For this survey a snorkeler or diver swam over the bottom while taking photos of the benthos at a set height using a standard digital camera and towing a GPS in a surface float which logged the track every five seconds. A standard digital compact camera was placed in an underwater housing and fitted with a 16 mm lens, which provided a 1.0 m x 1.0 m footprint at 0.5 m height above the benthos. Horizontal distance between photos was estimated by three fin kicks of the survey diver/snorkeler, which corresponded to a surface distance of approximately 2.0-4.0 m. The GPS was placed in a dry bag and logged the position as it floated at the surface while being towed by the photographer. A total of 5,735 benthic photos were taken. A floating GPS setup connected to the swimmer/diver by a line enabled recording of the coordinates of each benthic photo (Roelfsema, 2009). The coordinates of each benthic photo were approximated from the photo timestamp and the GPS coordinate timestamp using GPS Photo Link software (www.geospatialexperts.com): the coordinates of each photo were interpolated from the GPS positions logged a set time before and after the photo was captured. Benthic or substrate cover data were derived from each photo by randomly placing 24 points over each image using the Coral Point Count for Microsoft Excel program (Kohler and Gill, 2006). Each point was then assigned to 1 of 78 cover types, which represented the benthic feature beneath it. A benthic cover composition summary for each photo was generated automatically by the CPCE program. The resulting benthic cover data for each photo were linked to the GPS coordinates, saved as an ArcMap point shapefile, and projected to Universal Transverse Mercator WGS84 Zone 55 South.
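The georeferencing step described above amounts to linear interpolation between the two GPS fixes that bracket each photo's timestamp. The Python sketch below shows that logic in isolation (GPS Photo Link performs it in practice); the track points and photo time are hypothetical.

```python
# Timestamp-based interpolation of a photo position between bracketing GPS fixes.
from datetime import datetime

def interpolate_position(photo_time, track):
    """track: list of (datetime, lat, lon) sorted by time, e.g. logged every 5 s."""
    for (t0, lat0, lon0), (t1, lat1, lon1) in zip(track, track[1:]):
        if t0 <= photo_time <= t1:
            f = (photo_time - t0).total_seconds() / (t1 - t0).total_seconds()
            return lat0 + f * (lat1 - lat0), lon0 + f * (lon1 - lon0)
    raise ValueError("photo time outside GPS track")

track = [  # hypothetical surface-float fixes near Lizard Island
    (datetime(2011, 12, 10, 9, 0, 0), -14.68720, 145.46710),
    (datetime(2011, 12, 10, 9, 0, 5), -14.68722, 145.46718),
]
print(interpolate_position(datetime(2011, 12, 10, 9, 0, 2), track))
```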
Abstract:
Underwater georeferenced photo-transect surveys were conducted on October 3-7, 2012 at various sections of the reef and lagoon at Lizard Island, Great Barrier Reef. For this survey a snorkeler swam while taking photos of the benthos at a set distance from the benthos using a standard digital camera and towing a GPS in a surface float which logged the track every five seconds. A Canon G12 digital camera was placed in a Canon underwater housing and photos were taken at 1 m height above the benthos. Horizontal distance between photos was estimated by three fin kicks of the survey snorkeler, which corresponded to a surface distance of approximately 2.0-4.0 m. The GPS was placed in a dry bag and logged the position at the surface while being towed by the photographer (Roelfsema, 2009). A total of 1,265 benthic photos were taken. The coordinates of each benthic photo were approximated from the photo timestamp and the GPS coordinate timestamp using GPS Photo Link software (www.geospatialexperts.com): the coordinates of each photo were interpolated from the GPS positions logged a set time before and after the photo was captured. Benthic or substrate cover data were derived from each photo by randomly placing 24 points over each image using the Coral Point Count for Microsoft Excel program (Kohler and Gill, 2006). Each point was then assigned to 1 of 79 cover types, which represented the benthic feature beneath it. A benthic cover composition summary for each photo was generated automatically by the CPCE program. The resulting benthic cover data for each photo were linked to the GPS coordinates, saved as an ArcMap point shapefile, and projected to Universal Transverse Mercator WGS84 Zone 55 South.
Patients'/carers' recollection of medicines-related information from an out-patient clinic appointment
Abstract:
AIM: To identify what medicines-related information children/young people or their parents/carers are able to recall following an out-patient clinic appointment. METHOD: A convenience sample of patients prescribed at least one new long-term (>6 weeks) medicine was recruited from a single UK paediatric hospital out-patient pharmacy. A face-to-face semi-structured questionnaire was administered to participants when they presented with their prescription. The questionnaire included the following themes: names of the medicines, therapeutic indication, dose regimen, duration of treatment and adverse effects. The results were analysed using Microsoft Excel 2013. RESULTS: One hundred participants consented and were included in the study. One hundred and forty-five medicines were prescribed in total. Participants were able to recall the names of 96 (66%) medicines and were aware of the therapeutic indication for 142 (97.9%) medicines. The dose regimen was accurately described for 120 (82.8%) medicines, with the duration of treatment known for 132 (91%). Participants mentioned that they had been advised about side effects for 44 (30.3%) medicines. Specific counselling points recommended by the BNFc1 were either omitted or not recalled by participants for the following systemic treatments: cetirizine (1), chlorphenamine (1), desmopressin (2), hydroxyzine (2), itraconazole (1), piroxicam (2), methotrexate (1), stiripentol (1) and topiramate (1). CONCLUSION: Following an out-patient consultation where a new medicine is prescribed, children and their parents/carers are usually able to recall the indication, dose regimen and duration of treatment. Few were able to recall, or were told about, possible adverse effects; these may include some important drug-specific effects that require vigilance during treatment. Patients, along with families and carers, should be involved in the decision to prescribe a medicine.2 This includes a discussion about the benefits of the medicine for the patient's condition and possible adverse effects.2 Treatment side effects have been shown to be a factor in treatment non-adherence in paediatric long-term medical conditions.3 Practitioners should explain to patients, and their family members or carers where appropriate, how to identify and report medicines-related patient safety incidents.4 However, this study suggests that medical staff may not be comfortable discussing the adverse effects of medicines with patients or their parents/carers. Further research into the shared decision-making process in the paediatric out-patient clinic when a new long-term medicine is prescribed is required to further support medicines adherence and the patient safety agenda.
Abstract:
The rise of the twenty-first century has seen a further increase in the industrialization of Earth’s resources, as society aims to meet the needs of a growing population while still protecting our environmental and natural resources. The advent of the industrial bioeconomy – which encompasses the production of renewable biological resources and their conversion into food, feed, and bio-based products – is seen as an important step in the transition towards sustainable development and away from fossil fuels. One sector of the industrial bioeconomy that is rapidly being expanded is the use of bio-based feedstocks in electricity production as an alternative to coal, especially in the European Union.
As bioeconomy policies and objectives increasingly appear on political agendas, there is a growing need to quantify the impacts of transitioning from fossil fuel-based feedstocks to renewable biological feedstocks. Specifically, there is a growing need to conduct a systems analysis of the potential risks of expanding the industrial bioeconomy, given that the flows within it are inextricably linked. Furthermore, greater analysis is needed of the consequences of shifting from fossil fuels to renewable feedstocks, in part through the use of life cycle assessment modeling to analyze impacts along the entire value chain.
To assess the emerging nature of the industrial bioeconomy, three objectives are addressed: (1) quantify the global industrial bioeconomy, linking the use of primary resources with the ultimate end product; (2) quantify the impacts of the expanding wood pellet energy export market of the Southeastern United States; (3) conduct a comparative life cycle assessment, incorporating dynamic life cycle assessment, of replacing coal-fired electricity generation in the United Kingdom with wood pellets produced in the Southeastern United States.
To quantify the emergent industrial bioeconomy, an empirical analysis was undertaken. Existing databases from multiple domestic and international agencies were aggregated and analyzed in Microsoft Excel to produce a harmonized dataset of the bioeconomy. First-person interviews, existing academic literature, and industry reports were then utilized to delineate the various intermediate and end-use flows within the bioeconomy. The results indicate that within a decade, the industrial use of agriculture has risen ten percent, given increases in the production of bioenergy and bioproducts. The underlying resources supporting the emergent bioeconomy (i.e., land, water, and fertilizer use) were also quantified and included in the database.
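As a rough illustration of the harmonization step described above (done in Excel in the dissertation), the sketch below merges two small agency-style tables on a common key and reshapes them into one long-format dataset. The table names, columns and values are hypothetical.

```python
# Illustrative harmonization of two agency datasets into one long-format table.
import pandas as pd

usda = pd.DataFrame({"year": [2005, 2015], "corn_for_ethanol_mt": [34.0, 132.0]})    # hypothetical
fao = pd.DataFrame({"year": [2005, 2015], "industrial_roundwood_mm3": [1600.0, 1800.0]})  # hypothetical

# Merge on the common key, then melt to a harmonized (year, flow, quantity) layout.
bioeconomy = usda.merge(fao, on="year").melt(id_vars="year",
                                             var_name="flow", value_name="quantity")
print(bioeconomy)
```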
Following the quantification of the existing bioeconomy, an in-depth analysis of the bioenergy sector was conducted. Specifically, the focus was on quantifying the impacts of the emergent wood pellet export sector that has rapidly developed in recent years in the Southeastern United States. A cradle-to-gate life cycle assessment was conducted in order to quantify supply chain impacts for two wood pellet production scenarios: roundwood and sawmill residues. For each of the nine impact categories assessed, wood pellet production from sawmill residues resulted in values 10-31% higher.
The analysis of the wood pellet sector was then expanded to include the full life cycle (i.e., cradle-to-grave). In doing so, the combustion of biogenic carbon and the subsequent timing of emissions were assessed by incorporating dynamic life cycle assessment modeling. Assuming immediate carbon neutrality of the biomass, the results indicated an 86% reduction in global warming potential when utilizing wood pellets rather than coal for electricity production in the United Kingdom. When incorporating the timing of emissions, wood pellets equated to a 75% or 96% reduction in carbon dioxide emissions, depending on whether the forestry feedstock was considered to be harvested or planted in year one, respectively.
Finally, a policy analysis of renewable energy in the United States was conducted. Existing coal-fired power plants in the Southeastern United States were assessed in terms of incorporating the co-firing of wood pellets. Co-firing wood pellets with coal in existing Southeastern United States power stations would result in a nine percent reduction in global warming potential.
Abstract:
Aim: To study the outcomes for restored primary molar teeth and to examine outcomes in relation to the tooth type involved, the complexity of the intracoronal restoration and the material used. Materials and methods: Design: Retrospective study of primary molar teeth restored with intracoronal restorations. A series of restored primary molar teeth for children aged 6-12 years was studied. The principal outcome measure was failure of the initial restoration (re-restoration or extraction). Three hundred patient records were studied, comprising three equal groups of primary molar teeth restored with amalgam, composite or glass ionomer, respectively. The restorative material, the restoration type, simple (single surface) or complex (multi-surface), and the tooth notation were recorded. Subsequent interventions were examined. Data were coded and entered into a Microsoft Excel database and analysis undertaken using SPSS v.18. Statistical differences were tested using the χ2 test of statistical significance. Results: Of the 300 teeth studied, 61 restoration failures were recorded, with 11 of those teeth extracted. No significant differences were found between outcomes for upper first, upper second, lower first or lower second primary molars. Outcomes for primary teeth restored with simple intracoronal restorations were significantly better than those for complex intracoronal restorations (P = 0.042). Teeth originally restored with amalgam accounted for 19.7% of the 61 failures and composite for 29.5%, while teeth restored with glass ionomer represented 50.8% of all restoration failures. The differences were significant (P = 0.012). Conclusions: The majority (79.7%) of the 300 restored primary teeth studied were successful, and 3.7% of teeth were extracted. Restorations involving more than one surface had almost twice the failure rate of single-surface restorations; the difference was significant. Significant differences in failure rates for the three dental materials studied were recorded: amalgam had the lowest failure rate, while glass ionomer had the highest.
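The reported failure percentages imply approximately 12 (amalgam), 18 (composite) and 31 (glass ionomer) failures out of 100 restorations per material. The sketch below shows how the χ2 test of independence named above can be run on such a 3 x 2 contingency table; because the counts are rounded back from percentages, the computed p-value will only approximate the published P = 0.012.

```python
# Chi-square test of independence for restoration outcome (failed vs survived) by material.
# Counts reconstructed by rounding the reported failure percentages; approximate only.
from scipy.stats import chi2_contingency

#                 failed  survived   (out of 100 restorations per material)
table = [
    [12, 88],   # amalgam       (~19.7% of the 61 failures)
    [18, 82],   # composite     (~29.5% of the 61 failures)
    [31, 69],   # glass ionomer (~50.8% of the 61 failures)
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```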
Abstract:
The dissertation consists of three chapters related to the low-price guarantee marketing strategy and to energy efficiency analysis. A low-price guarantee is a marketing strategy in which firms promise to charge consumers the lowest price among their competitors. Chapter 1 addresses the research question "Does a Low-Price Guarantee Induce Lower Prices?" by looking into the retail gasoline industry in Quebec, where a major branded firm started a low-price guarantee in 1996. Chapter 2 conducts a consumer welfare analysis of low-price guarantees to derive policy implications and offers a new explanation of firms' incentives to adopt a low-price guarantee. Chapter 3 develops energy performance indicators (EPIs) to measure the energy efficiency of manufacturing plants in the pulp, paper and paperboard industry.
Chapter 1 revisits the traditional view that a low-price guarantee results in higher prices by facilitating collusion. Using accurate market definitions and station-level data from the retail gasoline industry in Quebec, I conduct a descriptive analysis based on stations and price zones to compare price and sales movements before and after the guarantee was adopted. I find that, contrary to the traditional view, the stores that offered the guarantee significantly decreased their prices and increased their sales. I also build a difference-in-differences model, which quantifies the decrease in the posted price of the stores that offered the guarantee at 0.7 cents per liter. While this change is significant, I do not find a significant response in competitors' prices. The sales of the stores that offered the guarantee increased significantly while the competitors' sales decreased significantly; however, the significance vanishes if I use station-clustered standard errors. Comparing my observations with the predictions of different theories of modeling low-price guarantees, I conclude that the empirical evidence supports the view that the low-price guarantee is a simple commitment device and induces lower prices.
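For readers unfamiliar with the specification, the sketch below writes the difference-in-differences regression price = b0 + b1*treated + b2*post + b3*treated*post in Python, with station-clustered standard errors as mentioned above. The data frame is a tiny hypothetical stand-in, not the chapter's station-level dataset.

```python
# Difference-in-differences sketch; b3 (the treated:post coefficient) is the effect of interest.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "price":   [105.2, 104.4, 105.0, 104.5, 105.3, 105.2, 105.1, 105.0],  # cents/litre (hypothetical)
    "treated": [1, 1, 1, 1, 0, 0, 0, 0],     # 1 = station offered the guarantee
    "post":    [0, 1, 0, 1, 0, 1, 0, 1],     # 1 = after the guarantee was introduced
    "station": ["a", "a", "b", "b", "c", "c", "d", "d"],
})

model = smf.ols("price ~ treated * post", data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["station"]})  # station-clustered SEs
print(result.params["treated:post"])  # estimated price change attributable to the guarantee
```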
Chapter 2 conducts a consumer welfare analysis of low-price guarantees to address the antitrust concerns about, and potential government regulation of, such guarantees, and explains firms' potential incentives to adopt them. Using station-level data from the retail gasoline industry in Quebec, I estimate consumers' demand for gasoline with a structural model of spatial competition that incorporates the low-price guarantee as a commitment device, which allows firms to pre-commit to charging the lowest price among their competitors. The counterfactual analysis under a Bertrand competition setting shows that the stores that offered the guarantee attracted many more consumers and decreased their posted price by 0.6 cents per liter. Although the matching stores suffered a decrease in profits from gasoline sales, they are incentivized to adopt the low-price guarantee to attract more consumers to the store, likely increasing profits at the attached convenience stores. Firms have strong incentives to adopt a low-price guarantee on the product about which their consumers are most price-sensitive, while earning a profit from the products that are not covered by the guarantee. I estimate that consumers earn about 0.3% more surplus when the low-price guarantee is in place, which suggests that the authorities need not be concerned about, or regulate, low-price guarantees. In Appendix B, I also propose an empirical model to examine how low-price guarantees would change consumer search behavior and whether consumer search plays an important role in estimating consumer surplus accurately.
Chapter 3, joint with Gale Boyd, describes work with the pulp, paper, and paperboard (PP&PB) industry to provide a plant-level indicator of energy efficiency for facilities that produce various types of paper products in the United States. Organizations that implement strategic energy management programs undertake a set of activities that, if carried out properly, have the potential to deliver sustained energy savings. Energy performance benchmarking is a key activity of strategic energy management and one way to enable companies to set energy efficiency targets for manufacturing facilities. The opportunity to assess a plant's energy performance through comparison with similar plants in its industry is a highly desirable and strategic method of benchmarking for industrial energy managers. However, access to energy performance data for conducting industry benchmarking is usually unavailable to most industrial energy managers. The U.S. Environmental Protection Agency (EPA), through its ENERGY STAR program, seeks to overcome this barrier through the development of manufacturing sector-based plant energy performance indicators (EPIs) that encourage U.S. industries to use energy more efficiently. In the development of the energy performance indicator tools, consideration is given to the role that performance-based indicators play in motivating change; the steps necessary for indicator development, from interacting with an industry and securing adequate data to the actual application and use of an indicator when complete. How indicators are employed in EPA's efforts to encourage industries to voluntarily improve their use of energy is discussed as well. The chapter describes the data and statistical methods used to construct the EPI for plants within selected segments of the pulp, paper, and paperboard industry: specifically, pulp mills and integrated paper and paperboard mills. The individual equations are presented, as are the instructions for using those equations as implemented in an associated Microsoft Excel-based spreadsheet tool.
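To make the benchmarking idea concrete, the sketch below scores a plant by where its residual from a simple cross-plant energy regression falls in the distribution, with higher scores meaning less energy used than predicted. This is a simplified stand-in: the actual ENERGY STAR EPI equations for pulp and paper mills are more detailed, and the data here are invented.

```python
# Simplified plant energy performance score from regression residuals (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
tons = rng.uniform(100, 1000, size=50)                # annual production (thousand tons), hypothetical
energy = 5.0 * tons + rng.normal(0, 200, size=50)     # site energy use (TJ), hypothetical

slope, intercept = np.polyfit(tons, energy, 1)        # benchmark regression across plants
residuals = energy - (slope * tons + intercept)       # positive = more energy than predicted

def epi_score(plant_residual, residuals):
    """Share of plants using more energy than this plant, given production (higher = better)."""
    return float(np.mean(residuals > plant_residual))

print(epi_score(residuals[0], residuals))
```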
Abstract:
Background: The perceptions of participation held by children with disabilities living in low- and middle-income countries are rarely reported in research, yet these perceptions are important for providing appropriate interventions. Aim: To describe how children aged 8-12 years with an intellectual disability living in Ethiopia perceive their situation regarding participation in activities of everyday life. Method: A descriptive design with a quantitative approach was used. The sample was gathered using consecutive sampling. Fifteen structured interviews were conducted using "Picture my participation," an instrument under development. Analyses were made using SPSS Statistics and Microsoft Excel. Results: The children perceived that they participated in activities of everyday life. There was broad variation in the activities the children prioritized as most important. At a group level, they were very involved in these activities. The majority did not experience any barriers to performing these activities. Conclusions: The majority of the children perceived that they were involved in daily activities and did not experience any barriers to participation. The results should be read with caution, and generalization is not possible, due to the sample characteristics and the fact that the instrument is under development.