970 results for Cooking losses


Relevance: 10.00%

Publisher:

Abstract:

One major benefit of land application of biosolids is to supply nitrogen (N) for agricultural crops, and understanding mineralisation processes is the key to better N-management strategies. Field studies were conducted to investigate the process of mineralisation of three biosolids products (aerobic, anaerobic, and thermally dried biosolids) incorporated into four different soils at rates of 7-90 wet t/ha in subtropical Queensland. Two of these studies also examined mineralisation rates of commonly used organic amendments (composts, manures, and sugarcane mill muds). Organic N in all biosolids products mineralised very rapidly under ambient conditions in subtropical Queensland, with rates much faster than from other common amendments. Biosolids mineralisation rates ranged from 30 to 80% of applied N during periods ranging from 3.5 to 18 months after biosolids application; these rates were much higher than those suggested in the biosolids land application guidelines established by the NSW EPA (15% for anaerobic and 25% for aerobic biosolids). There was no consistently significant difference in mineralisation rate between aerobic and anaerobic biosolids in our studies. When applied at similar rates of N addition, other organic amendments supplied much less N to the soil mineral N and plant N pools during the crop season. A significant proportion of the applied biosolids total N (up to 60%) was unaccounted for at the end of the observation period. High rates of N addition under calculated Nitrogen Limited Biosolids Application Rates (850-1250 kg N/ha) resulted in excessive accumulation of mineral N in the soil profile, which increases the environmental risk of leaching, runoff, or gaseous N losses.
Moreover, the rapid mineralisation of the biosolids organic N in these subtropical environments suggests that biosolids should be applied at lower rates than in temperate areas, and that care must be taken with the timing to maximise plant uptake and minimise possible leaching, runoff, or denitrification losses of mineralised N.

Abstract:

Rhipicephalus microplus is an important bovine ectoparasite, widely distributed in tropical and subtropical regions of the world, causing large economic losses to the cattle industry. Its success as an ectoparasite is associated with its capacity to disarm the antihemostatic and anti-inflammatory reactions of the host. Serpins are protease inhibitors with an important role in the modulation of host-parasite interactions. The cDNA that encodes a R. microplus serpin was isolated by RACE and subsequently cloned into the pPICZ alpha A vector. Sequence analysis of the cDNA and predicted amino acid sequence showed that this cDNA has a conserved serpin domain. B- and T-cell epitopes were predicted using bioinformatics tools. The recombinant R. microplus serpin (rRMS-3) was secreted into the culture media of Pichia pastoris after methanol induction at 0.2 mg l(-1). qRT-PCR expression analysis of tissues and life cycle stages demonstrated that RMS-3 was mainly expressed in the salivary glands of female adult ticks. Immunological recognition of the rRMS-3 and predicted B-cell epitopes was tested using tick-resistant and susceptible cattle sera. Only sera from tick-resistant bovines recognized the B-cell epitope AHYNPPPPIEFT (Seq7). The recombinant RMS-3 was expressed in P. pastoris, and ELISA screening also showed higher recognition by tick-resistant bovine sera. The results obtained suggest that RMS-3 is highly and specifically secreted into the bite site of R. microplus feeding on tick-resistant bovines. Capillary feeding of semi-engorged ticks with anti-AHYNPPPPIEFT sheep sera led to an 81.16% reduction in the reproduction capacity of R. microplus. Therefore, it is possible to conclude that R. microplus serpin (RMS-3) has an important role in the host-parasite interaction, overcoming the immune responses in resistant cattle. (C) 2012 Elsevier GmbH. All rights reserved.

Abstract:

In agricultural systems which rely on organic sources of nitrogen (N), of which the primary source is biological N fixation (BNF), it is extremely important to use N as efficiently as possible with minimal losses to the environment. The amount of N through BNF should be maximised and the availability of the residual N after legumes should be synchronised with the needs of subsequent plants in the crop rotation. Six field experiments in three locations in Finland were conducted in 1994-2006 to determine the productivity and amount of BNF in red clover-grass leys of different ages. The residual effects of the leys for subsequent cereals, as well as the N leaching risk, were studied by field measurements and by simulation using the CoupModel. N use efficiency (NUE) and N balances were also calculated. The yields of red clover-grass leys were highest in the two-year-old leys (6700 kg ha-1) under study, but the differences between 2- and 3-year-old leys were not large in most cases. BNF (90 kg ha-1 in harvested biomass) correlated strongly with red clover dry matter yield, as the proportion of red clover N derived from the atmosphere (> 85%) was high under our conditions of an organically farmed field with low soil mineral N. A red clover content of over 40% in dry matter is targeted to avoid negative N balances and to gain N for the subsequent crop. Surprisingly, the leys had no significant effect on the yields and N uptake of the two subsequent cereals (winter rye or spring wheat, followed by spring oats). On the other hand, the yield and C:N of leys, as well as the BNF-N and total N incorporated into the soil, influenced subsequent cereal yields. NUE of cereals from incorporated ley crop residues was rather high, varying from 30% to 80% (mean 48%). The mineral N content of soil in the profile of 0-90 cm was low, mainly 15-30 kg ha-1.
CoupModel simulated N dynamics satisfactorily and is considered a useful tool to estimate N flows in cropping systems relying on organic N sources. Understanding the long-term influence of cultivation history and soil properties on N dynamics remains a challenge for further research.
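The N accounting quantities named above (BNF, N balances, NUE) reduce to simple arithmetic. The sketch below follows common agronomic definitions; the study's exact formulas are not given in the abstract, and the example figures are hypothetical, chosen only to be consistent with the numbers reported above.

```python
# Hypothetical sketch of the quantities calculated in the study.
# Definitions follow common agronomic usage, not necessarily the
# study's own formulas.

def bnf_kg_per_ha(legume_n_yield_kg, ndfa_fraction):
    """N fixed biologically = legume N in biomass x fraction of N
    derived from the atmosphere (%Ndfa)."""
    return legume_n_yield_kg * ndfa_fraction

def n_balance(n_inputs_kg, n_removed_kg):
    """Simple field N balance; positive = net N gain to the system."""
    return n_inputs_kg - n_removed_kg

def apparent_nue(cereal_n_uptake_kg, residue_n_kg):
    """Share (%) of incorporated residue N recovered by the
    following cereal."""
    return cereal_n_uptake_kg / residue_n_kg * 100

# Example consistent with the abstract: ~105 kg N/ha in harvested
# clover biomass (hypothetical) at 86% Ndfa gives roughly the
# reported 90 kg BNF-N/ha in harvested biomass.
bnf = bnf_kg_per_ha(105, 0.86)
```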

Abstract:

Context. Irregular plagues of house mice cause high production losses in grain crops in Australia. If plagues can be forecast through broad-scale monitoring or model-based prediction, then mice can be proactively controlled by poison baiting. Aims. To predict mouse plagues in grain crops in Queensland and assess the value of broad-scale monitoring. Methods. Regular trapping of mice at the same sites on the Darling Downs in southern Queensland has been undertaken since 1974. This provides an index of abundance over time that can be related to rainfall, crop yield, winter temperature and past mouse abundance. Other sites have been trapped over a shorter time period elsewhere on the Darling Downs and in central Queensland, allowing a comparison of mouse population dynamics and cross-validation of models predicting mouse abundance. Key results. On the regularly trapped 32-km transect on the Darling Downs, damaging mouse densities occur in 50% of years and a plague in 25% of years, with no detectable increase in mean monthly mouse abundance over the past 35 years. High mouse abundance on this transect is not consistently matched by high abundance in the broader area. Annual maximum mouse abundance in autumn–winter can be predicted (R2 = 57%) from spring mouse abundance and autumn–winter rainfall in the previous year. In central Queensland, mouse dynamics contrast with those on the Darling Downs and lack the distinct annual cycle, with peak abundance occurring in any month outside early spring. On average, damaging mouse densities occur in 1 in 3 years and a plague occurs in 1 in 7 years. The dynamics of mouse populations on two transects ~70 km apart were rarely synchronous. Autumn–winter rainfall can indicate mouse abundance in some seasons (R2 = ~52%). Conclusion. Early warning of mouse plague formation in Queensland grain crops from regional models should trigger farm-based monitoring.
This can be incorporated with rainfall into a simple model predicting future abundance that will determine any need for mouse control. Implications. A model-based warning of a possible mouse plague can highlight the need for local monitoring of mouse activity, which in turn could trigger poison baiting to prevent further mouse build-up.
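The kind of two-predictor model described above (spring abundance plus the previous autumn–winter rainfall predicting peak abundance) can be sketched with ordinary least squares. The data, variable ranges and coefficients below are entirely synthetic, invented for illustration; only the model form follows the abstract.

```python
import numpy as np

# Synthetic illustration of the two-predictor linear model the study
# describes: autumn-winter peak mouse abundance predicted from spring
# abundance and the previous year's autumn-winter rainfall.
rng = np.random.default_rng(0)
n = 30
spring_abundance = rng.uniform(0, 50, n)   # e.g. captures per 100 trap-nights
prior_rainfall = rng.uniform(100, 400, n)  # mm, previous autumn-winter

# Invented "true" relationship plus noise, NOT the study's data
peak_abundance = (2.0 + 0.8 * spring_abundance
                  + 0.05 * prior_rainfall
                  + rng.normal(0, 1.0, n))

# Ordinary least squares fit: intercept + two predictors
X = np.column_stack([np.ones(n), spring_abundance, prior_rainfall])
beta, *_ = np.linalg.lstsq(X, peak_abundance, rcond=None)

# Coefficient of determination, analogous to the R2 = 57% reported
# for the Darling Downs transect
pred = X @ beta
ss_res = np.sum((peak_abundance - pred) ** 2)
ss_tot = np.sum((peak_abundance - peak_abundance.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
```

With real trapping data, a fit like this would be cross-validated against independent transects, as the study did between regions.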

Abstract:

Data from 9296 calves born to 2078 dams over 9 years across five sites were used to investigate factors associated with calf mortality for tropically adapted breeds (Brahman and Tropical Composite) recorded in extensive production systems, using multivariate logistic regression. The average calf mortality pre-weaning was 9.5% of calves born, varying from 1.5% to 41% across all sites and years. In total, 67% of calves that died did so within a week of their birth, with cause of death most frequently recorded as unknown. The major factors significantly (P < 0.05) associated with mortality for potentially large numbers of calves included the specific production environment represented by site-year, low calf birthweight (more so than high birthweight) and horn status at branding. Almost all calf deaths post-branding (assessed from n = 8348 calves) occurred in calves that were dehorned, totalling 2.1% of dehorned calves and 15.9% of all calf deaths recorded. Breed effects on calf mortality were primarily the result of breed differences in calf birthweight and, to a lesser extent, large teat size of cows; however, differences in other breed characteristics could be important. Twin births and calves assisted at birth had a very high risk of mortality, but <1% of calves were twins and few calves were assisted at birth. Conversely, it could not be established how many calves would have benefitted from assistance at birth. Cow age group and outcome from the previous season were also associated with current calf mortality; maiden or young cows (<4 years old) had increased calf losses overall. More mature cows with a previous outcome of calf loss were also more likely to have another calf loss in the subsequent year, and this should be considered for culling decisions. Closer attention to the management of younger cows is warranted to improve calf survival.

Abstract:

Global cereal production will need to increase by 50% to 70% to feed a world population of about 9 billion by 2050. This intensification is forecast to occur mostly in subtropical regions, where warm and humid conditions can promote high N2O losses from cropped soils. To secure high crop production without exacerbating N2O emissions, new nitrogen (N) fertiliser management strategies are necessary. This one-year study evaluated the efficacy of a nitrification inhibitor (3,4-dimethylpyrazole phosphate—DMPP) and different N fertiliser rates to reduce N2O emissions in a wheat–maize rotation in subtropical Australia. Annual N2O emissions were monitored using a fully automated greenhouse gas measuring system. Four treatments were fertilised with different rates of urea: a control (40 kg-N ha−1 year−1), a conventional N fertiliser rate adjusted on estimated residual soil N (120 kg-N ha−1 year−1), a conventional N fertiliser rate (240 kg-N ha−1 year−1) and a conventional N fertiliser rate (240 kg-N ha−1 year−1) with nitrification inhibitor (DMPP) applied at top dressing. The maize season was by far the main contributor to annual N2O emissions due to the high soil moisture and temperature conditions, as well as the elevated N rates applied. Annual N2O emissions in the four treatments amounted to 0.49, 0.84, 2.02 and 0.74 kg N2O–N ha−1 year−1, respectively, and corresponded to emission factors of 0.29%, 0.39%, 0.69% and 0.16% of total N applied. Halving the annual conventional N fertiliser rate in the adjusted N treatment led to N2O emissions comparable to the DMPP treatment but extensively penalised maize yield. The application of DMPP produced a significant reduction in N2O emissions only in the maize season. The use of DMPP with urea at the conventional N rate reduced annual N2O emissions by more than 60% but did not affect crop yields.
The results of this study indicate that: (i) future strategies aimed at securing subtropical cereal production without increasing N2O emissions should focus on the fertilisation of the summer crop; (ii) adjusting conventional N fertiliser rates on estimated residual soil N is an effective practice to reduce N2O emissions but can lead to substantial yield losses if the residual soil N is not assessed correctly; (iii) the application of DMPP is a feasible strategy to reduce annual N2O emissions from subtropical wheat–maize rotations. However, at the N rates tested in this study DMPP urea did not increase crop yields, making it impossible to recoup the extra costs associated with this fertiliser. The findings of this study will support farmers and policy makers in defining effective fertilisation strategies to reduce N2O emissions from subtropical cereal cropping systems while maintaining high crop productivity. More research is needed to assess the use of DMPP urea in terms of reducing conventional N fertiliser rates, which would subsequently enable a decrease in fertilisation costs and a further abatement of fertiliser-induced N2O emissions.
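Emission factors like those quoted above relate cumulative N2O-N losses to the N applied. A minimal sketch of the common difference-method calculation (treatment emissions minus a low-N background, divided by the N rate) follows; whether this matches the study's exact convention is an assumption, and using the 40 kg-N control as the background term is likewise an assumption, though the annual totals are taken from the abstract.

```python
def emission_factor(emitted_kg_n, background_kg_n, n_applied_kg):
    """Fertiliser-induced N2O emission factor, % of applied N.

    Difference method: (treatment emissions - background emissions)
    divided by the N application rate.
    """
    return (emitted_kg_n - background_kg_n) / n_applied_kg * 100.0

# Annual totals from the abstract (kg N2O-N/ha/yr); treating the
# 40 kg-N control as the background term is an assumption.
background = 0.49
treatments = {
    "adjusted rate (120 kg N)": (0.84, 120),
    "conventional rate (240 kg N)": (2.02, 240),
    "conventional + DMPP (240 kg N)": (0.74, 240),
}
efs = {name: emission_factor(emitted, background, n_rate)
       for name, (emitted, n_rate) in treatments.items()}
```

Computed this way, the DMPP treatment's factor falls well below the conventional treatment's, consistent with the >60% reduction reported.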

Abstract:

Objective To compare reproduction in extensively managed, tropically adapted beef cows that were either seropositive or seronegative to Neospora caninum. Design Longitudinal study of cows within management groups. Methods Pregnancy and weaning outcomes were compared for 502 seropositive and 3255 seronegative cows in 25 management groups. Results We found N. caninum in all herds, with an average of 20% of 2640 tested animals seropositive within management groups; prevalence varied between 0% and 94%. At 7 of 10 sites assessed, there was evidence of horizontal transmission of N. caninum. There was no overall difference in pregnancy rate (79% vs 75%; P > 0.05), reproductive wastage after confirmed pregnancy diagnosis (11% vs 10%; P > 0.05) or weaning rate (67% vs 68%; P > 0.05) between seronegative and seropositive cows, respectively. In one herd where a combination of risk factors for N. caninum was present, a significant reduction in pregnancy rate occurred after the 6-month mating period (85% vs 69%; P < 0.05). The fetal and calf losses observed were lowest in south-east Queensland (4.3% of 117 pregnancies), highest in north-west Queensland (15.5% of 413 pregnancies) and intermediate in north-east Queensland (10.2% of 1625 pregnancies). Other infectious agents that are known to cause reproductive wastage were endemic in many herds, though none appeared to cause significant fetal or calf loss in this study. Conclusion Despite a high prevalence of N. caninum, there was no apparent effect on beef cattle reproduction, but there is potential for reproductive wastage if known risk factors for neosporosis are in effect.

Abstract:

Alternaria leaf blotch and fruit spot caused by Alternaria spp. cause annual losses to the Australian apple industry. Control options are limited, mainly due to a lack of understanding of the disease cycle. Therefore, this study aimed to determine potential sources of Alternaria spp. inoculum in the orchard and examine their relative contribution throughout the production season. Leaf residue from the orchard floor, canopy leaves, twigs and buds were collected monthly from three apple orchards for two years and examined for the number of spores on their surface. In addition, the effects of climatic factors on spore production dynamics in each plant part were examined. Although all four plant parts tested contributed to the Alternaria inoculum in the orchard, significantly higher numbers of spores were obtained from leaf residue than from the other plant parts, supporting the hypothesis that overwintering of Alternaria spp. occurred mainly in leaf residue and minimally on twigs and buds. The most significant period of spore production on leaf residue occurred from dormancy until bloom, and on canopy leaves and twigs during the fruit growth stage. Temperature was the single most significant factor influencing the amount of Alternaria inoculum, while rainfall and relative humidity showed strong associations with temperature in influencing spore production dynamics in Australian orchards. The practical implications of this study include the removal of leaf residue from the orchard floor and sanitation of the canopy after harvest to remove residual spores from the trees.

Abstract:

Alternaria leaf blotch and fruit spot of apple caused by Alternaria spp. cause annual losses to the Australian apple industry. Control with protectant fungicides is often erratic, which may be due to a lack of understanding of the timing of infection and the epidemiology of the diseases. We found that Alternaria leaf blotch infection began about 20 days after bloom (DAB) and the highest disease incidence occurred from 70 to 110 DAB. Alternaria fruit spot infection occurred about 100 DAB in the orchard. Fruit inoculations in planta showed that there was no specific susceptible stage of fruit. Leaves and fruit in the lower canopy of trees showed higher levels of leaf blotch and fruit spot incidence than those in the upper canopy, and the incidence of leaf blotch in shoot leaves was higher than in spur leaves. Temperature, relative humidity, and rainfall affected leaf blotch and fruit spot incidence. This knowledge of the timing of infection and disease development may aid the development of more effective disease management strategies.

Abstract:

In the past decade, the Finnish agricultural sector has undergone rapid structural changes. The number of farms has decreased and the average farm size has increased, while the number of farms transferred to new entrants has decreased. Part of the structural change in agriculture is manifested in early retirement programmes. In studies of farmers' exit behaviour in different countries, institutional differences, incentive programmes and constraints have been found to matter. In Finland, farmers' early retirement programmes were first introduced in 1974 and, during the last ten years, they have been carried out within the European Union framework for these programmes. The early retirement benefits are farmer-specific and depend on the level of pension insurance the farmer has paid over his active farming years. In order to predict the future development of the agricultural sector, farmers have been frequently asked about their future plans and their plans for succession. However, the plans the farmers made for succession have been found to be time-inconsistent. This study estimates the value of farmers' stated succession plans in predicting revealed succession decisions. A stated succession plan exists when a farmer answers in a survey questionnaire that the farm is going to be transferred to a new entrant within a five-year period. The succession is revealed when the farm is transferred to a successor. Stated and revealed behaviour was estimated as a recursive binomial probit model, which accounts for the censoring of the decision variables and controls for a potential correlation between the two equations. The results suggest that the succession plans, as stated by elderly farmers in the questionnaires, do not provide information that is significant and valuable in predicting true, completed successions. Therefore, farmer exit should be analysed based on observed behaviour rather than on stated plans and intentions.
As farm retirement plays a crucial role in determining the characteristics of structural change in agriculture, it is important to establish the factors which determine an exit from farming among elderly farmers and how off-farm income and income losses affect their exit choices. In this study, the observed choice of pension scheme by elderly farmers was analysed by a bivariate probit model. Despite some variations in significance and the effects of each factor, the ages of the farmer and spouse, the age and number of potential successors, farm size, income loss when retiring and the location of the farm, together with the production line, were found to be the most important determinants of early retirement and the transfer or closure of farms. Recently, the labour status of the spouse has been found to contribute significantly to individual retirement decisions. In this study, the effect of spousal retirement and economic incentives related to the timing of a farming couple's early retirement decision were analysed with a duration model. The results suggest that an expected pension in particular advances farm transfers. It was found that on farms operated by a couple, both early retirement and farm succession took place more often than on farms operated by a single person. However, the existence of a spouse delayed the timing of early retirement. Farming couples were found to coordinate their early retirement decisions when both exited through agricultural retirement programmes, but such coordination did not exist when one of the spouses retired under other pension schemes. Besides changes in the agricultural structure, the share and amount of off-farm income in a farm family's total income has also increased. In this study, the effect of off-farm income on farmers' retirement decisions, in addition to other financial factors, was analysed.
The unknown parameters were first estimated by a switching-type multivariate probit model and then by the simulated maximum likelihood (SML) method, controlling for farmer-specific fixed effects and serial correlation of the errors. The results suggest that elderly farmers' off-farm income is a significant determinant in a farmer's choice to exit and close down the farm. However, off-farm income only has a short-term effect on structural change in agriculture since it does not significantly contribute to the timing of farm successions.

Abstract:

The aim of this review is to report changes in irrigated cotton water use from research projects and on-farm practice-change programs in Australia, in relation to both plant-based and irrigation engineering disciplines. At least 80% of the Australian cotton-growing area is irrigated using gravity surface-irrigation systems. This review found that, over 23 years, cotton crops utilise 6-7 ML/ha of irrigation water, depending on the amount of seasonal rain received. The seasonal evapotranspiration of surface-irrigated crops averaged 729 mm over this period. Over the past decade, water-use productivity by Australian cotton growers has improved by 40%. This has been achieved by both yield increases and more efficient water-management systems. The whole-farm irrigation efficiency index improved from 57% to 70%, and the crop water use index is >3 kg/mm.ha, high by international standards. Yield increases over the last decade can be attributed to plant-breeding advances, the adoption of genetically modified varieties, and improved crop management. Also, there has been increased use of irrigation scheduling tools and furrow-irrigation system optimisation evaluations. This has reduced in-field deep-drainage losses. The largest loss component of the farm water balance on cotton farms is evaporation from on-farm water storages. Some farmers are changing to alternative systems such as centre pivots and lateral-move machines, and increasing numbers of these alternatives are expected. These systems can achieve considerable labour and water savings, but have significantly higher energy costs associated with water pumping and machine operation. The optimisation of interactions between water, soils, labour, carbon emissions and energy efficiency requires more research and on-farm evaluations.
Standardisation of water-use efficiency measures and improved water measurement techniques for surface irrigation are important research outcomes to enable valid irrigation benchmarks to be established and compared. Water-use performance is highly variable between cotton farmers and farming fields and across regions. Therefore, site-specific measurement is important. The range in the presented datasets indicates potential for further improvement in water-use efficiency and productivity on Australian cotton farms.
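The two indices cited above (whole-farm irrigation efficiency, crop water use index) are ratios. The definitions in this sketch follow common usage and are assumptions, since the review's exact formulas are not reproduced here; the lint yield figure is hypothetical, chosen to sit just above the 3 kg/mm.ha benchmark at the reported 729 mm seasonal ET.

```python
def crop_water_use_index(lint_yield_kg_per_ha, seasonal_et_mm):
    """Lint yield per mm of seasonal evapotranspiration, kg/(mm.ha)."""
    return lint_yield_kg_per_ha / seasonal_et_mm

def whole_farm_irrigation_efficiency(crop_water_use_ml, water_diverted_ml):
    """Share (%) of water diverted onto the farm that is used by the crop."""
    return crop_water_use_ml / water_diverted_ml * 100.0

# Hypothetical lint yield of 2200 kg/ha over the 729 mm seasonal ET
# reported in the review gives an index just above 3 kg/mm.ha.
cwui = crop_water_use_index(2200, 729)
```

Standardised definitions like these are exactly what the review argues are needed before benchmarks can be compared across farms and regions.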

Abstract:

BACKGROUND Kernel brown centres in macadamia are a defect causing internal discolouration of kernels. This study investigates the incidence of brown centres in raw kernel when high moisture content is maintained in macadamia nuts-in-shell stored at temperatures of 30°C, 35°C, 40°C and 45°C. RESULTS Brown centres in raw kernel increased with nuts-in-shell storage time and temperature when high moisture content was maintained by sealing in polyethylene bags. Almost all kernels developed the defect when kept at high moisture content for 5 days at 45°C, and 44% developed brown centres after only 2 days of storage at high moisture content at 45°C. This contrasted with only 0.76% when stored for 2 days at 45°C but allowed to dry in open-mesh bags. At storage temperatures below 45°C, there were fewer brown centres, but there were still significant differences between kernels stored at high moisture content and those allowed to dry (P < 0.05). CONCLUSION Maintenance of high moisture content during macadamia nuts-in-shell storage increases the incidence of brown centres in raw kernels, and the defect increases with time and temperature. On-farm nuts-in-shell drying and storage practices should remove moisture rapidly to reduce losses. Ideally, nuts-in-shell should not be stored at high moisture content on-farm at temperatures over 30°C. © 2013 Society of Chemical Industry

Abstract:

Rhipicephalus (Boophilus) microplus (Acari: Ixodidae) ticks cause economic losses for cattle industries throughout tropical and subtropical regions of the world estimated at $US2.5 billion annually. Lack of access to efficacious long-lasting vaccination regimes and increases in tick acaricide resistance have led to the investigation of targets for the development of novel tick vaccines and treatments. In vitro tick feeding has been used for many tick species to study the effect of new acaricides on the transmission of tick-borne pathogens. Few studies have reported the use of in vitro feeding for functional genomic studies using RNA interference and/or the effect of specific anti-tick antibodies. In particular, in vitro feeding reports for the cattle tick are limited due to its relatively short hypostome. Previously published methods were further modified to broaden the range of optimal tick sizes/weights and feeding sources, including bovine and ovine serum, and to optimise commercially available blood anti-coagulant tubes and IgG concentrations for effective antibody delivery. Ticks are fed overnight and monitored for ∼5–6 weeks in a humidified incubator to determine egg output and the success of larval emergence. Lithium-heparin blood tubes provided the most reliable anti-coagulant for bovine blood feeding compared with commercial citrated (CPDA) and EDTA tubes. Although >30 mg semi-engorged ticks fed more reliably, ticks as small as 15 mg also fed to repletion and laid viable eggs. Ticks which gained less than ∼10 mg during in vitro feeding typically did not lay eggs. One mg/ml IgG from Bm86-vaccinated cattle produced a potent anti-tick effect in vitro (83% efficacy), similar to that observed in vivo. Alternatively, feeding of dsRNA targeting Bm86 did not demonstrate anti-tick effects (11% efficacy) compared with the potent effects of ubiquitin dsRNA. This study optimises R. microplus in vitro tick feeding methods which support the development of cattle tick vaccines and treatments.

Abstract:

Many banana producing regions around the world experience climate variability as a result of seasonal rainfall and temperature conditions, which result in sub-optimal conditions for banana production. This can create periods of plant stress which impact on plant growth, development and yields. Furthermore, diseases such as Fusarium wilt, caused by Fusarium oxysporum f. sp. cubense, can become more predominant following periods of environmental stress, particularly for many culturally significant cultivars such as Ducasse (synonym Pisang Awak) (Musa ABB). The aim of this experiment was to determine if expression of symptoms of Fusarium wilt of bananas in a susceptible cultivar could be explained by environmental conditions, and if soil management could reduce the impact of the disease and increase production. An experiment was established in an abandoned commercial field of Ducasse bananas with a high incidence of Fusarium wilt. Vegetated ground cover was maintained around the base of banana plants and compared with plants grown in bare soil for changes in growth, production and disease symptoms. Expression of Fusarium wilt was found to be a function of water stress potential and the heat unit requirement for bananas. The inclusion of vegetative ground cover around the base of the banana plants significantly reduced the severity and incidence of Fusarium wilt by 20% and altered the periods of symptom development. The growth of bananas and development of the bunch followed the accumulated heat units, with a greater number of bunched plants evident during warmer periods of the year. The weight of bunches harvested in a second crop cycle was increased when banana plants were grown in areas with vegetative ground cover, with fewer losses of plants due to Fusarium wilt.

Abstract:

Two field trials were conducted with untreated coconut wood (“cocowood”) of varying densities against the subterranean termites Coptotermes acinaciformis (Froggatt) and Mastotermes darwiniensis Froggatt in northern Queensland, Australia. Both trials ran for 16 weeks during the summer months. Cocowood densities ranged from 256 kg/m³ to 1003 kg/m³, and the test specimens were equally divided between the two termite trial sites. Termite pressure was high at both sites, where mean mass losses in the Scots pine sapwood feeder specimens were 100% for C. acinaciformis and 74.7% for M. darwiniensis. Termite species and cocowood density effects were significant. Container and position effects were not significant. Mastotermes darwiniensis fed more on the cocowood than did C. acinaciformis, despite consuming less of the Scots pine than did C. acinaciformis. Overall, the susceptibility of cocowood to C. acinaciformis and M. darwiniensis decreases with increasing density, but all densities (apart from a few at the high end of the density range) could be considered susceptible, particularly to M. darwiniensis. Some deviations from this general trend are discussed, as well as implications for the utilisation of cocowood as a building resource.