Abstract:
The study focused on the different ways that forest-related rights can be devolved to the local level according to the current legal frameworks in Laos, Nepal, Vietnam, Kenya, Mozambique and Tanzania. The eleven case studies represented the main ways in which forest-related rights can be devolved to communities or households in these countries. The objectives of this study were to 1) analyse the contents and extent of forest-related rights that can be devolved to the local level, 2) develop an empirical typology that represents the main types of devolution, and 3) compare the cases against a theoretical ideal type to assess in what way and to what extent the cases are similar to or differ from the theoretical construct. Fuzzy set theory, Qualitative Comparative Analysis and ideal type analysis were used in analysing the case studies and in developing an empirical typology. The theoretical framework, which guided data collection and analyses, was based on institutional economics and theories on property rights, common pool resources and collective action. On the basis of the theoretical and empirical knowledge, the most important attributes of rights were defined as use rights, management rights, exclusion rights, transfer rights and the duration and security of the rights. The ideal type was defined as one where local actors have been devolved comprehensive use rights, extensive management rights, rights to exclude others from the resource and rights to transfer these rights. In addition, the rights are to be secure and held perpetually. The ideal type was used to structure the analysis and as a tool against which the cases were analysed. The contents, extent and duration of the devolved rights varied greatly. In general, the results show that devolution has mainly meant the transfer of use rights to the local level, and has not really changed the overall state control over forest resources. 
In most cases the right holders participate in, or have only a limited role in, the decision making regarding the harvesting and management of the resource. There was a clear tendency to devolve the rights to enforce rules and to monitor resource use and condition more extensively than the powers to decide on the management and development of the resource. The empirical typology of the cases differentiated between five types of devolution. The types can be characterised by the devolution of 1) restricted use and control rights, 2) extensive use rights but restricted control rights, 3) extensive rights, 4) insecure, short-term use and restricted control rights, and 5) insecure extensive rights. Overall, the case studies' conformity to the ideal type was very low: only two cases were similar to the ideal type; all other cases differed considerably from it. Restricted management rights were the most common reason for the low conformity to the ideal type (eight cases). In three cases, the short term of the rights, restricted transfer rights, restricted use rights or restricted exclusion rights were the reason, or one of the reasons, for the low conformity to the ideal type. In two cases the rights were not secure.
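The fuzzy-set ideal type analysis described above can be sketched in code: a case's membership in the ideal type is the minimum of its memberships in the individual attribute sets (the standard fuzzy-set intersection). The attribute names below follow the abstract; the membership scores are hypothetical, not the study's data.

```python
# Sketch of ideal-type conformity using fuzzy-set intersection (minimum rule).
# Attribute names follow the abstract; membership scores are hypothetical.

IDEAL_ATTRIBUTES = ["use", "management", "exclusion", "transfer", "duration", "security"]

def ideal_type_membership(case):
    """Membership of a case in the ideal type = minimum over attribute memberships."""
    return min(case[attr] for attr in IDEAL_ATTRIBUTES)

cases = {
    "A": {"use": 0.9, "management": 0.8, "exclusion": 0.9,
          "transfer": 0.7, "duration": 1.0, "security": 0.9},
    "B": {"use": 0.8, "management": 0.2, "exclusion": 0.6,
          "transfer": 0.4, "duration": 0.9, "security": 0.7},
}

for name, attrs in cases.items():
    m = ideal_type_membership(attrs)
    verdict = "conforms" if m > 0.5 else "differs"
    print(f"case {name}: membership {m:.2f} -> {verdict}")
```

With the minimum rule, a single weakly devolved attribute (here case B's management rights) caps the whole case's conformity, which mirrors the finding that restricted management rights were the most common reason for low conformity.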
Abstract:
The aim of this thesis is to develop a fully automatic lameness detection system that operates in a milking robot. The instrumentation, measurement software, algorithms for data analysis and a neural network model for lameness detection were developed. Automatic milking has become a common practice in dairy husbandry, and in the year 2006 about 4000 farms worldwide used over 6000 milking robots. There is a worldwide movement with the objective of fully automating every process from feeding to milking. The increase in automation is a consequence of increasing farm sizes, the demand for more efficient production and rising labour costs. As the level of automation increases, the time that the cattle keeper uses for monitoring animals often decreases. This has created a need for systems that automatically monitor the health of farm animals. The popularity of milking robots also offers a new and unique possibility to monitor animals in a single confined space up to four times daily. Lameness is a crucial welfare issue in the modern dairy industry. Limb disorders cause serious welfare, health and economic problems, especially in loose housing of cattle. Lameness causes losses in milk production and leads to early culling of animals. These costs could be reduced with early identification and treatment. At present, only a few methods for automatically detecting lameness have been developed, and the most common methods used for lameness detection and assessment are various visual locomotion scoring systems. The problem with locomotion scoring is that it requires experience to be conducted properly, it is labour-intensive as an on-farm method and the results are subjective. A four-balance system for measuring the leg-load distribution of dairy cows during milking, in order to detect lameness, was developed and set up at the University of Helsinki research farm Suitia.
The leg weights of 73 cows were successfully recorded during almost 10,000 robotic milkings over a period of 5 months. The cows were locomotion scored weekly, and the lame cows were inspected clinically for hoof lesions. Unsuccessful measurements, caused by cows standing outside the balances, were removed from the data with a special algorithm, and the mean leg loads and the number of kicks during milking were calculated. In order to develop an expert system to automatically detect lameness cases, a model was needed. A probabilistic neural network (PNN) classifier model was chosen for the task. The data were divided into two parts, and 5,074 measurements from 37 cows were used to train the model. The model was evaluated for its ability to detect lameness in the validation dataset, which had 4,868 measurements from 36 cows. The model was able to classify 96% of the measurements correctly as sound or lame cows, and 100% of the lameness cases in the validation data were identified. The number of measurements causing false alarms was 1.1%. The developed model has the potential to be used for on-farm decision support and can be used in a real-time lameness monitoring system.
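A probabilistic neural network of the kind chosen here is essentially a Parzen-window classifier: each class density is a sum of Gaussian kernels centred on the class's training vectors, and a new measurement is assigned to the class with the larger estimated density. The sketch below illustrates the PNN principle only; the leg-load patterns and smoothing parameter sigma are invented, not the study's trained model.

```python
import math

# Minimal probabilistic neural network (PNN) classifier. Training vectors
# (normalised leg-load patterns) and sigma are illustrative assumptions.

def pnn_class_density(x, samples, sigma=0.5):
    """Parzen-window density estimate for one class: mean of Gaussian kernels."""
    d = len(x)
    norm = 1.0 / ((2 * math.pi) ** (d / 2) * sigma ** d * len(samples))
    total = 0.0
    for s in samples:
        dist2 = sum((xi - si) ** 2 for xi, si in zip(x, s))
        total += math.exp(-dist2 / (2 * sigma ** 2))
    return norm * total

def pnn_classify(x, classes, sigma=0.5):
    """Return the class label whose estimated density at x is largest."""
    return max(classes, key=lambda label: pnn_class_density(x, classes[label], sigma))

# Hypothetical patterns: sound cows load all four legs evenly,
# lame cows unload the affected leg.
training = {
    "sound": [(0.25, 0.25, 0.25, 0.25), (0.26, 0.24, 0.25, 0.25)],
    "lame":  [(0.10, 0.30, 0.30, 0.30), (0.08, 0.32, 0.30, 0.30)],
}

print(pnn_classify((0.24, 0.26, 0.25, 0.25), training))  # close to the sound pattern
print(pnn_classify((0.09, 0.31, 0.30, 0.30), training))  # close to the lame pattern
```

A practical advantage of the PNN for this application is that adding new labelled milkings requires no retraining: new training vectors are simply appended to the class sample sets.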
Abstract:
A new two-stage state feedback control design approach has been developed to monitor the voltage supplied to magnetorheological (MR) dampers for semi-active vibration control of the benchmark highway bridge. The first stage contains a primary controller, which provides the force required to obtain a desired closed-loop response of the system. In the second stage, an optimal dynamic inversion (ODI) approach has been developed to obtain the amount of voltage to be supplied to each of the MR dampers such that it provides the required force prescribed by the primary controller. ODI is formulated by combining optimization with dynamic inversion, such that an optimal voltage is supplied to each damper in a set. The proposed control design has been simulated for both the phase-I and phase-II studies of the recently developed benchmark highway bridge problem. The efficiency of the proposed controller is analyzed in terms of the performance indices defined in the benchmark problem definition. Simulation results demonstrate that the proposed approach generally reduces peak response quantities below those obtained from the sample semi-active controller, although some response quantities increased. Overall, the proposed control approach is quite competitive with the sample semi-active control approach.
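The abstract does not give the ODI formulation in detail, so the following is only a simplified illustration of the second-stage idea, mapping a desired force to a damper voltage by inverting a damper model. It assumes a hypothetical Bingham-type model, F = c0·v + (f0 + k·V)·sign(v), and clips the commanded voltage to the admissible range; all parameters and the model itself are invented for illustration, not taken from the paper.

```python
# Illustrative force-to-voltage inversion for a simplified Bingham-type
# MR damper model: F = C0*v + (F0 + K*V)*sign(v). Hypothetical parameters.

C0 = 2.0      # viscous coefficient (kN.s/m)
F0 = 1.0      # friction force at zero voltage (kN)
K = 4.0       # controllable force gain per volt (kN/V)
V_MAX = 10.0  # damper voltage limit (V)

def voltage_command(f_desired, velocity):
    """Voltage that best tracks f_desired under the simplified damper model."""
    if velocity == 0:
        return 0.0  # no controllable force at zero piston velocity
    sign = 1.0 if velocity > 0 else -1.0
    # Invert f_desired = C0*velocity + (F0 + K*V)*sign for V
    v_cmd = ((f_desired - C0 * velocity) * sign - F0) / K
    return min(max(v_cmd, 0.0), V_MAX)  # clip to the admissible range

# Desired force 11 kN while the piston moves at +0.5 m/s:
# 11 = 2*0.5 + (1 + 4*V)  ->  V = 2.25
print(voltage_command(11.0, 0.5))
```

When the desired force exceeds what the damper can generate, the command saturates at V_MAX; in the paper's formulation, the optimization step distributes this tracking task over the full set of dampers rather than treating each in isolation.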
Abstract:
Southern Hemisphere plantation forestry has grown substantially over the past few decades and will play an increasing role in fibre production and carbon sequestration in future. The sustainability of these plantations is, however, increasingly under pressure from introduced pests. This pressure requires an urgent and matching increase in the speed and efficiency at which tools are developed to monitor and control these pests. This review considers the potential role of semiochemicals in addressing the need for more efficient pest control in Southern Hemisphere plantations, particularly by drawing on research from other parts of the world. Semiochemical research in forestry has grown exponentially over the last 40 years but has been almost exclusively focussed on Northern Hemisphere forests. In these forests, semiochemicals have played an important role in enhancing the efficiency of integrated pest management programmes. An analysis of semiochemical research from 1970 to 2010 showed a rapid increase over time. It also indicated that pheromones have been the most extensively studied type of semiochemical in forestry, contributing 92% of the semiochemical literature over this period, compared with research on plant kairomones. This research has led to numerous applications in detecting new invasions and monitoring population levels and spread, in addition to controlling pests by mass trapping or by disrupting aggregation and mating signals. The value of semiochemicals as an environmentally benign and efficient approach to managing forest plantation pests in the Southern Hemisphere seems obvious. There is, however, a lack of research capacity and focus to optimally capture this opportunity. Given the pressure from increasing numbers of pests and reduced opportunities to use pesticides, there is some urgency to develop semiochemical research capacity.
Abstract:
Differences in morphology have provided a basis for detecting natural interspecific hybridisation in forest trees for decades but have come to prominence again more recently as a means for directly measuring gene flow from planted forests. Here we examined the utility of seedling morphology for hybrid discrimination in three hybrid groups relevant to the monitoring of gene flow from plantings of Corymbia (L.D. Pryor & L.A.S. Johnson ex Brooker) taxa in subtropical Australia. Thirty leaf and stem characters were assessed on 907 eight-month-old seedlings from four parental and six hybrid taxa grown in a common garden. Outbred F1 hybrids between spotted gums (Corymbia citriodora subspecies variegata, C. citriodora subspecies citriodora and Corymbia henryi) tended to more closely resemble their maternal Corymbia torelliana parent, and the most discriminating characters were the ratio of blade length to maximum perpendicular width, the presence or absence of a lignotuber, and specific leaf weight. Assignment of individuals into genealogical classes based on a multivariate model limited to a set of the more discriminating and independent characters was highest in the hybrid group where parental taxa were genetically most divergent. Overall power to resolve outbred F1 hybrids from both parental taxa was low to moderate, but this may not be a limitation to its likely major application of identifying hybrids in seedlots from native spotted gum stands. Advanced-generation hybrids (outbred F2 and outbred backcrosses) were more difficult to resolve reliably due to the higher variances of hybrid taxa and the tendency of backcrosses to resemble their recurrent parents. Visual assessment of seedling morphology may provide a filter allowing screening of the large numbers needed to monitor gene flow, but will need to be combined with other hybrid detection methods to ensure hybrids are detected.
Abstract:
Loss of nitrogen in deep drainage from agriculture is an important issue for environmental and economic reasons, but limited field data is available for tropical crops. In this study, nitrogen (N) loads leaving the root zone of two major humid tropical crops in Australia, sugarcane and bananas, were measured. The two field sites, 57 km apart, had a similar soil type (a well drained Dermosol) and rainfall (∼2700 mm year⁻¹) but contrasting crops and management. A sugarcane crop in a commercial field received 136-148 kg N ha⁻¹ year⁻¹ applied in one application each year and was monitored for 3 years (first to third ratoon crops). N treatments of 0-600 kg ha⁻¹ year⁻¹ were applied to a plant and following ratoon crop of bananas. N was applied as urea throughout the growing season in irrigation water through mini-sprinklers. Low-suction lysimeters were installed at a depth of 1 m under both crops to monitor loads of N in deep drainage. Drainage at 1 m depth in the sugarcane crops was 22-37% of rainfall. Under bananas, drainage in the row was 65% of rainfall plus irrigation for the plant crop, and 37% for the ratoon. Nitrogen leaching loads were low under sugarcane (<1-9 kg ha⁻¹ year⁻¹), possibly reflecting the N fertiliser applications being reasonably matched to crop requirements and at least 26 days between fertiliser application and deep drainage. Under bananas, there were large loads of N in deep drainage when N application rates were in excess of plant demand, even when applied fortnightly. The deep drainage loss of N attributable to N fertiliser, calculated by subtracting the loss from unfertilised plots, was 246 and 641 kg ha⁻¹ over 2 crop cycles, which was equivalent to 37 and 63% of the fertiliser application for treatments receiving 710 and 1065 kg ha⁻¹, respectively. Those rates of fertiliser application resulted in soil acidification to a depth of 0.6 m by as much as 0.6 of a unit at 0.1-0.2 m depth.
The higher leaching losses from bananas indicated that they should be a priority for improved N management.
Abstract:
The aim of the current study was to investigate whether polymerase chain reaction amplification of 16S ribosomal (r)RNA and a putative hemolysin gene operon, hhdBA, can be used to monitor live pigs for the presence of Haemophilus parasuis and predict the virulence of the strains present. Nasal cavity swabs were taken from 30 live, healthy, 1- to 8-week-old pigs on a weekly cycle from a commercial Thai nursery pig herd. A total of 27 of these pigs (90%) tested positive for H. parasuis as early as week 1 of age. None of the H. parasuis-positive samples from healthy pigs was positive for the hhdBA genes. At the same pig nursery, swab samples from nasal cavity, tonsil, trachea, and lung, and exudate samples from pleural/peritoneal cavity were taken from 30 dead pigs displaying typical pathological lesions consistent with Glasser disease. Twenty-two of 140 samples (15.7%) taken from 30 diseased pigs yielded a positive result for H. parasuis. Samples from the exudate (27%) yielded the most positive results, followed by lung, tracheal swab, tonsil, and nasal swab, respectively. Out of 22 positive samples, 12 samples (54.5%) harbored hhdA and/or hhdB genes. Detection rates of hhdA were higher than hhdB. None of the H. parasuis-positive samples taken from nasal cavity of diseased pigs tested positive for hhdBA genes. More work is required to determine if the detection of hhdBA genes is useful for identifying the virulence potential of H. parasuis field isolates.
Abstract:
Hydrogen cyanide (HCN) is a toxic chemical that can potentially cause mild to severe reactions in animals when grazing forage sorghum. Developing technologies to monitor the level of HCN in the growing crop would benefit graziers, so that they can move cattle into paddocks with acceptable levels of HCN. In this study, we developed near-infrared spectroscopy (NIRS) calibrations to estimate HCN in forage sorghum and hay. The full spectral NIRS range (400-2498 nm) was used, as well as specific spectral ranges within the full spectral range, i.e., visible (400-750 nm), shortwave (800-1100 nm) and near-infrared (NIR) (1100-2498 nm). Using the full-spectrum approach and partial least-squares (PLS) regression, the calibration produced a coefficient of determination (R²) = 0.838 and standard error of cross-validation (SECV) = 0.040%, while the validation set had an R² = 0.824 with a low standard error of prediction (SEP = 0.047%). When using a multiple linear regression (MLR) approach, the best model (NIR spectra) produced an R² = 0.847 and standard error of calibration (SEC) = 0.050%, and an R² = 0.829 and SEP = 0.057% for the validation set. The MLR models built from these spectral regions all used nine wavelengths. Two specific wavelengths, 2034 and 2458 nm, were of interest, the former associated with C=O carbonyl stretch and the latter with C-N-C stretching. The most accurate PLS and MLR models produced a ratio of standard error of prediction to standard deviation of 3.4 and 3.0, respectively, suggesting that the calibrations could be used for screening breeding material. The results indicated that it should be feasible to develop calibrations using PLS or MLR models for a number of users, including breeding programs to screen for genotypes with low HCN, as well as graziers to monitor crop status to help with grazing efficiency.
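As a toy analogue of such a calibration, the sketch below fits an ordinary least-squares line to a single synthetic "absorbance" predictor and computes R² on the calibration set and the bias-corrected standard error of prediction (SEP) on a validation set. All values are invented; the study's actual models used PLS or nine-wavelength MLR on real spectra.

```python
# Toy single-wavelength calibration: OLS fit, R^2 on the calibration set,
# bias-corrected SEP on the validation set. All data are synthetic.

def fit_ols(xs, ys):
    """Slope and intercept of the least-squares line through (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def r_squared(xs, ys, slope, intercept):
    """Coefficient of determination of the fitted line on (xs, ys)."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

def sep(xs, ys, slope, intercept):
    """Bias-corrected standard error of prediction on a validation set."""
    resid = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    bias = sum(resid) / len(resid)
    return (sum((r - bias) ** 2 for r in resid) / (len(resid) - 1)) ** 0.5

# Synthetic calibration and validation sets: "absorbance" vs %HCN
cal_x, cal_y = [0.10, 0.20, 0.30, 0.40], [0.05, 0.11, 0.14, 0.21]
val_x, val_y = [0.15, 0.25, 0.35], [0.08, 0.12, 0.18]

slope, intercept = fit_ols(cal_x, cal_y)
print(round(r_squared(cal_x, cal_y, slope, intercept), 3))
print(round(sep(val_x, val_y, slope, intercept), 4))
```

The RPD statistic quoted in the abstract is simply the standard deviation of the validation reference values divided by this SEP; values above about 3 are conventionally taken as adequate for screening.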
Abstract:
Objective To examine the combined effects of physical activity and weight status on blood pressure (BP) in preschool-aged children. Study design The sample included 733 preschool-aged children (49% female). Physical activity was objectively assessed on 7 consecutive days by accelerometry. Children were categorized as sufficiently active if they met the recommendation of at least 60 minutes daily of moderate-to-vigorous physical activity (MVPA). Body mass index was used to categorize children as nonoverweight or overweight/obese, according to the International Obesity Task Force benchmarks. BP was measured using an automated BP monitor and categorized as elevated or normal using BP percentile-based cut-points for age, sex, and height. Results The prevalence of elevated systolic BP (SBP) and diastolic BP was 7.7% and 3.0%, respectively. The prevalence of overweight/obese was 32%, and about 15% of children did not accomplish the recommended 60 minutes of daily MVPA. After controlling for age and sex, overweight/obese children who did not meet the daily MVPA recommendation were 3 times more likely (OR 3.8; CI 1.6-8.6) to have elevated SBP than nonoverweight children who met the daily MVPA recommendation. Conclusions Overweight or obese preschool-aged children with insufficient levels of MVPA are at significantly greater risk for elevated SBP than their nonoverweight and sufficiently active counterparts.
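The odds-ratio analysis reported above can be illustrated with a hypothetical 2×2 table. The counts below are invented and chosen only to make the arithmetic transparent; the study's estimate (OR 3.8) was additionally adjusted for age and sex.

```python
import math

# Worked odds-ratio example on a hypothetical 2x2 table:
# rows = exposure group, columns = elevated SBP yes/no. Counts are invented.

def odds_ratio(a, b, c, d):
    """OR for exposed (a cases, b non-cases) vs unexposed (c cases, d non-cases)."""
    return (a / b) / (c / d)

def or_confidence_interval(a, b, c, d, z=1.96):
    """Approximate 95% CI via the standard error of the log odds ratio."""
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return math.exp(log_or - z * se), math.exp(log_or + z * se)

# Hypothetical: overweight/obese children not meeting the MVPA recommendation
# vs nonoverweight children meeting it.
a, b = 12, 60    # exposed: elevated SBP, not elevated
c, d = 10, 200   # unexposed: elevated SBP, not elevated
print(round(odds_ratio(a, b, c, d), 1))  # 4.0
print(tuple(round(v, 2) for v in or_confidence_interval(a, b, c, d)))
```

In the study itself the adjusted estimate came from a logistic regression controlling for age and sex, which the raw 2×2 calculation above does not capture.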
Abstract:
Historical stocking methods of continuous, season-long grazing of pastures with little account of growing conditions have caused some degradation within grazed landscapes in northern Australia. Alternative stocking methods have been implemented to address this degradation and raise the productivity and profitability of the principal livestock, cattle. Because information comparing stocking methods is limited, an evaluation was undertaken to quantify the effects of stocking methods on pastures, soils and grazing capacity. The approach was to monitor existing stocking methods on nine commercial beef properties in north and south Queensland. Environments included native and exotic pastures and eucalypt (lighter soil) and brigalow (heavier soil) land types. Breeding and growing cattle were grazed under each method. The owners/managers, formally trained in pasture and grazing management, made all management decisions affecting the study sites. Three stocking methods were compared: continuous (with rest), extensive rotation and intensive rotation (commonly referred to as 'cell grazing'). There were two or three stocking methods examined on each property: in total 21 methods (seven continuous, six extensive rotations and eight intensive rotations) were monitored over 74 paddocks, between 2006 and 2009. Pasture and soil surface measurements were made in the autumns of 2006, 2007 and 2009, while the paddock grazing was analysed from property records for the period from 2006 to 2009. The first 2 years had drought conditions (rainfall average 3.4 decile) but were followed by 2 years of above-average rainfall. There were no consistent differences between stocking methods across all sites over the 4 years for herbage mass, plant species composition, total and litter cover, or landscape function analysis (LFA) indices. 
There were large responses to rainfall in the last 2 years, with mean herbage mass in the autumn increasing from 1970 kg DM ha⁻¹ in 2006-07 to 3830 kg DM ha⁻¹ in 2009. Over the same period, ground and litter cover and LFA indices increased. Across all sites and 4 years, mean grazing capacity was similar for the three stocking methods. There were, however, significant differences in grazing capacity between stocking methods at four sites, but these differences were not consistent between stocking methods or sites. Both the continuous and intensive rotation methods supported the highest average annual grazing capacity at different sites. The results suggest that cattle producers can obtain similar ecological responses and carry similar numbers of livestock under any of the three stocking methods.
Abstract:
Aim This study evaluated the validity of the OMNI Walk/Run Rating of Perceived Exertion (OMNI-RPE) scores against heart rate and oxygen consumption (VO2) for children and adolescents with cerebral palsy (CP). Method Children and adolescents with CP, aged 6 to 18 years and Gross Motor Function Classification System (GMFCS) levels I to III, completed a physical activity protocol with seven trials ranging in intensity from sedentary to moderate-to-vigorous. VO2 and heart rate were recorded during the physical activity trials using a portable indirect calorimeter and heart rate monitor. Participants reported OMNI-RPE scores for each trial. Concurrent validity was assessed by calculating the average within-subject correlation between OMNI-RPE ratings and the two physiological indices. Results For the correlational analyses, 48 participants (22 males, 26 females; age 12y 6mo, SD 3y 4mo) had valid bivariate data for VO2 and OMNI-RPE, while 40 participants (21 males, 19 females; age 12y 5mo, SD 2y 9mo) had valid bivariate data for heart rate and OMNI-RPE. VO2 (r=0.80; 95% CI 0.66–0.88) and heart rate (r=0.83; 95% CI 0.70–0.91) were moderately to highly correlated with OMNI-RPE scores. No difference was found for the correlation of physiological data and OMNI-RPE scores across the three GMFCS levels. The OMNI-RPE scores increased significantly in a dose-response manner (F(6,258)=116.1, p<0.001) as exercise intensity increased from sedentary to moderate-to-vigorous. Interpretation OMNI-RPE is a clinically feasible option to monitor exercise intensity in ambulatory children and adolescents with CP.
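An average within-subject correlation of the kind used here can be computed by calculating Pearson's r for each participant and then averaging via the Fisher z-transform, a common approach (the abstract does not state which averaging method the authors used). The per-trial values below are invented for illustration.

```python
import math

# Average within-subject correlation: Pearson r per participant, then a
# Fisher z-average across participants. Per-trial values are invented.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def mean_within_subject_r(subjects):
    """Fisher z-average of per-subject correlations."""
    zs = [math.atanh(pearson_r(x, y)) for x, y in subjects]
    return math.tanh(sum(zs) / len(zs))

# (VO2, OMNI-RPE) pairs across seven trials for two hypothetical participants
subjects = [
    ([4, 6, 9, 12, 16, 20, 25], [0, 1, 2, 4, 6, 8, 10]),
    ([5, 7, 8, 11, 15, 19, 24], [0, 2, 3, 3, 5, 7, 9]),
]
print(round(mean_within_subject_r(subjects), 2))
```

Averaging in z-space rather than averaging raw r values avoids the downward bias that arises because r is bounded at 1.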
Abstract:
Purpose This study investigated how nitrogen (N) nutrition and key physiological processes varied under changed water and nitrogen competition resulting from different weed control and fertilisation treatments in a 2-year-old F1 hybrid (Pinus elliottii Engelm var. elliottii × P. caribaea var. hondurensis Barr. ex Golf.) plantation on a grey podzolic soil type in Southeast Queensland. Materials and methods The study integrated a range of measures including growth variables (diameter at ground level (DGL), diameter at breast height (DBH) and height (H)), foliar variables (including foliar N concentration, foliar δ¹³C and δ¹⁵N) and physiological variables (including photosynthesis (An), stomatal conductance (gs), transpiration (E), intrinsic water use efficiency (WUEi) (An/gs) and xylem pressure potential (ΨXPP)) to better understand the mechanisms influencing growth under different weed control and fertilisation treatments. Five levels of weed control were applied: standard (routine), luxury, intermediate, mechanical and nil weed control, all with routine fertilisation, plus an additional treatment, routine weed control and luxury fertilisation. Relative weed cover was assessed at 0.8, 1.1 and 1.6 years after plantation establishment to monitor the effectiveness of weed control treatments. Soil investigation included soil ammonium (NH₄⁺-N), nitrate (NO₃⁻-N), potentially mineralizable N (PMN), gravimetric soil moisture content (MC), hot water extractable organic carbon (HWETC), hot water extractable total N (HWETN), total C, total N, stable C isotope composition (δ¹³C), stable N isotope composition (δ¹⁵N), total P and extractable K.
Results and discussion There were significant relationships between foliar N concentration and relative weed cover, and between tree growth and foliar N concentration or foliar δ¹⁵N, but initial site preparation practices also increased soil N transformations in the planting rows, reducing the observable effects of weed control on foliar δ¹⁵N. A positive relationship between foliar N concentration and foliar δ¹³C or photosynthesis indicated that increased N availability to trees positively influenced non-stomatal limitations to photosynthesis. However, trees with increased foliar N concentration and photosynthesis had lower xylem pressure potentials in the afternoons, which enhanced stomatal limitations to photosynthesis and WUEi. Conclusions Luxury and intermediate weed control and luxury fertilisation positively influenced growth at early establishment by reducing the competition for water and N resources. This influenced fundamental physiological processes such as the relationships between foliar N concentration, An, E, gs and ΨXPP. Results also confirmed that time from cultivation is an important factor influencing the effectiveness of using foliar δ¹⁵N as an indicator of soil N transformations.
Abstract:
- Objective This study examined chronic disease risks and the use of a smartphone activity tracking application during an intervention in Australian truck drivers (April-October 2014). - Methods Forty-four men (mean age=47.5 [SD 9.8] years) completed baseline health measures, and were subsequently offered access to a free wrist-worn activity tracker and smartphone application (Jawbone UP) to monitor step counts and dietary choices during a 20-week intervention. Chronic disease risks were evaluated against guidelines; weekly step count and dietary logs registered by drivers in the application were analysed to evaluate use of the Jawbone UP. - Results Chronic disease risks were high (e.g. 97% had a high waist circumference [≥94 cm]). Eighteen drivers (41%) did not start the intervention; smartphone technical barriers were the main reason for dropout. Across 20 weeks, drivers who used the Jawbone UP logged step counts for an average of 6 [SD 1] days/week; mean step counts remained consistent across the intervention (weeks 1–4=8,743 [SD 2,867] steps/day; weeks 17–20=8,994 [SD 3,478] steps/day). The median number of dietary logs decreased significantly from the start (17 [IQR 38] logs/week) to the end of the intervention (0 [IQR 23] logs/week; p<0.01); the median proportion of healthy diet choices relative to total diet choices logged increased across the intervention (weeks 1–4=38 [IQR 21]%; weeks 17–20=58 [IQR 18]%). - Conclusions Step counts were more successfully monitored than dietary choices in those drivers who used the Jawbone UP. - Implications Smartphone technology facilitated active living and healthy dietary choices, but also hindered intervention engagement in a number of these high-risk Australian truck drivers.
Abstract:
The objective of this study was to investigate patterns of soil water extraction and drought resistance among genotypes of bermudagrass (Cynodon spp.), a perennial C4 grass. Four wild Australian ecotypes (1-1, 25a1, 40-1, and 81-1) and four cultivars (CT2, Grand Prix, Legend, and Wintergreen) were examined in field experiments with rainfall excluded, to monitor soil water extraction at 30-190 cm depths. In this study we defined drought resistance as the ability to maintain green canopy cover under drought. The most drought-resistant genotypes (40-1 and 25a1) maintained more green cover (55-85% vs 5-10%) during water deficit and extracted more soil water (120-160 mm vs 77-107 mm) than drought-sensitive genotypes, especially at depths from 50 to 110 cm, though all genotypes extracted water to 190 cm. The maintenance of green cover and higher soil water extraction were associated with higher stomatal conductance, photosynthetic rate and relative water content. For all genotypes, the pattern of water use as a percentage of total water use was similar across depth and time. We propose that the observed genetic variation was related to different root characteristics (root length density, hydraulic conductivity, root activity), although shoot sensitivity to drying soil cannot be ruled out.
Abstract:
The treatment of large segmental bone defects remains a significant clinical challenge. Due to limitations surrounding the use of bone grafts, tissue-engineered constructs for the repair of large bone defects could offer an alternative. Before translation of any newly developed tissue engineering (TE) approach to the clinic, efficacy of the treatment must be shown in a validated preclinical large animal model. Currently, biomechanical testing, histology, and microcomputed tomography are performed to assess the quality and quantity of the regenerated bone. However, in vivo monitoring of the progression of healing is seldom performed, although it could reveal important information regarding time to restoration of mechanical function and acceleration of regeneration. Furthermore, since the mechanical environment is known to influence bone regeneration, and limb loading of the animals is poorly controlled, characterizing activity and load history could help explain variability in the acquired data sets, and potentially outliers, based on abnormal loading. Many approaches have been devised to monitor the progression of healing and characterize the mechanical environment in fracture healing studies. In this article, we review previous methods and share results of recent work by our group toward developing and implementing a comprehensive biomechanical monitoring system to study bone regeneration in preclinical TE studies.