857 results for High Risk


Relevance: 60.00%

Abstract:

This study aimed to examine the incidence of young adult-onset T1DM and T2DM among Finns, and to explore possible risk factors for young adult-onset T1DM and T2DM that occur during the perinatal period and childhood. In Studies I-II, the incidence of diabetes was examined among 15-39-year-old Finns during the years 1992-2001. Information on new diagnoses of diabetes was collected from four sources: standardized national reports filled in by diabetes nurses, the Hospital Discharge Register, the Drug Reimbursement Register, and the Drug Prescription Register. The type of diabetes was assigned using information obtained from these four data sources. The incidence of T1DM was 18 per 100,000/year, with a clear male predominance, and it increased on average by 3.9% per year during 1992-2001. The incidence of T2DM was 13 per 100,000/year, and it displayed an increase of 4.3% per year. In Studies III-V, the effects of perinatal exposures and childhood growth on the risk of young adult-onset T1DM and T2DM were explored in a case-control setting. Individuals diagnosed with T1DM (n=1,388) and T2DM (n=1,121) during the period 1992-1996 were chosen as the diabetes cases, and two controls were chosen for each case from the National Population Register. Data on the study subjects' parents and siblings were obtained from the National Population Register, and the subjects' original birth records and child welfare clinic records were traced nationwide. The risk of young adult-onset T2DM was lowest among the offspring of mothers aged about 30 years and increased towards both younger and older maternal ages. Birth orders two to four were found to be protective against T2DM. In addition, the risk of T2DM decreased with increasing birth weight up to 4.2 kg, after which it began to increase. A high body mass index (BMI) at the BMI rebound between ages 3 and 11 years substantially increased the risk of T2DM, and the excess weight gain in individuals diagnosed with T2DM began in early childhood. Maternal age, birth order and body size at birth had no effect on the risk of young adult-onset T1DM; instead, individuals with T1DM had a higher maximum BMI before the age of 3 than their control subjects. In conclusion, the increasing trend in the development of both T1DM and T2DM among young Finnish adults is alarming. The high risk of T1DM among the Finnish population extends to at least 40 years of age, and at least 200-300 young Finnish adults are diagnosed with T2DM every year. Growth during the fetal period and childhood notably affects the risk of T2DM, so T2DM prevention should also target childhood obesity. Rapid growth during the first years of life may be a risk factor for late-onset T1DM.
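For readers who want to reproduce the style of calculation behind these figures, incidence per 100,000 person-years and the average annual change are straightforward to compute once yearly case counts and the population at risk are known. A minimal sketch with invented numbers (not the study's data):

```python
import numpy as np

# Hypothetical yearly new diagnoses among 15-39-year-olds and the population
# at risk; these numbers are illustrative, not the study's data.
years = np.arange(1992, 2002)
cases = np.array([290, 300, 310, 325, 330, 345, 355, 370, 385, 400])
population = np.full(years.shape, 1_750_000)  # assumed person-years per calendar year

# Incidence per 100,000 person-years for each calendar year
incidence = cases / population * 100_000

# Average annual change (%) from a log-linear fit: log(rate) = a + b*year
slope, intercept = np.polyfit(years, np.log(incidence), 1)
annual_change_pct = (np.exp(slope) - 1) * 100

print(f"mean incidence: {incidence.mean():.1f} per 100,000/year")
print(f"average annual increase: {annual_change_pct:.1f}%")
```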

Relevance: 60.00%

Abstract:

The introduction of glyphosate-tolerant cotton has significantly improved the flexibility and management of a number of problem weeds in cotton systems. However, reliance on glyphosate poses risks to the industry in terms of glyphosate resistance and species shift. The aims of this project were to identify these risks and to determine strategies to prevent and mitigate the potential for resistance evolution. Field surveys identified fleabane as the most common weed in both irrigated and dryland systems. Sowthistle has also increased in prevalence, and bladder ketmia and peachvine remained common. The continued reliance on glyphosate has favoured small-seeded and glyphosate-tolerant species; fleabane is both, with populations confirmed resistant in grains systems in Queensland and NSW. When species were assessed for their resistance risk, fleabane, liverseed grass, feathertop Rhodes grass, sowthistle and barnyard grass were determined to have high risk ratings. Management practices in summer fallows and in dryland glyphosate-tolerant and conventional cotton were also determined to rely heavily on glyphosate and therefore to be high risk. Situations where these high-risk species are present in high-risk cropping phases need particular attention. The confirmation of a glyphosate-resistant barnyard grass population in a dryland glyphosate-tolerant cotton system means resistance is now a reality for the cotton industry. Experiments have shown that resistant populations can be managed with other herbicide options currently available; however, the options for fleabane management in cotton are still limited. Although some selective residual herbicides are showing promise, the majority of fleabane control tactics can only be used in other phases of the cotton rotation. An online glyphosate resistance tool has been developed. This tool allows growers to assess their individual glyphosate resistance risks and to see how they can adjust their practices to reduce those risks. It also provides researchers with current information on the weed species present and the practices used across the industry, and will be extremely useful in tailoring future research and extension efforts. Simulations from the expanded glyphosate resistance model have shown that glyphosate resistance can be prevented and managed in glyphosate-tolerant cotton farming systems, although successful strategies require some effort. The simulations highlight the importance of controlling survivors of glyphosate applications, using effective glyphosate alternatives in fallows, and combining several effective glyphosate alternatives in crop; these are the key to the prevention and management of glyphosate resistance.
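The online risk tool mentioned above works by scoring a grower's answers about weed species and management practices. Its actual questions and weightings are not given here, so the sketch below is a purely hypothetical scoring scheme that only illustrates the general idea:

```python
# Hypothetical glyphosate-resistance risk score; factor names and weights are
# invented for illustration and do not reproduce the industry tool.
RISK_WEIGHTS = {
    "glyphosate_applications_per_year": 8,   # heavier reliance, higher risk
    "high_risk_species_present": 25,         # e.g. fleabane, barnyard grass
    "survivors_left_uncontrolled": 30,
    "no_alternative_mode_of_action": 22,
    "no_tillage_or_residuals_in_fallow": 15,
}

def risk_score(answers: dict) -> str:
    """answers maps each factor to a value in [0, 1]; the applications factor
    is a count that is scaled against 4 sprays/year and capped at 1."""
    score = 0.0
    for factor, weight in RISK_WEIGHTS.items():
        value = answers.get(factor, 0)
        if factor == "glyphosate_applications_per_year":
            value = min(value / 4, 1.0)
        score += weight * value
    if score >= 70:
        return f"HIGH risk ({score:.0f}/100)"
    if score >= 40:
        return f"MEDIUM risk ({score:.0f}/100)"
    return f"LOW risk ({score:.0f}/100)"

print(risk_score({
    "glyphosate_applications_per_year": 4,
    "high_risk_species_present": 1,
    "survivors_left_uncontrolled": 1,
    "no_alternative_mode_of_action": 0,
    "no_tillage_or_residuals_in_fallow": 1,
}))  # -> HIGH risk (78/100)
```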

Relevance: 60.00%

Abstract:

The Fitzroy Basin is the second largest catchment area in Australia, covering 143,000 km², and is the largest catchment draining to the Great Barrier Reef lagoon (Karfs et al., 2009). The Great Barrier Reef is the largest reef system in the world, covering an area of approximately 225,000 km² on the northern Queensland continental shelf; approximately 750 reefs exist within 40 km of the Queensland coast (Haynes et al., 2007). Changes in water quality have been attributed primarily to grazing, with beef production the largest single land use, comprising 90% of the land area (Karfs et al., 2009). In response to the decline in water quality in the reef, a Reef Water Quality Protection Plan was developed by the Australian and Queensland governments in 2003. The plan targets as a priority the sediment contributions from cattle grazing in high-risk catchments (The State of Queensland and Commonwealth of Australia, 2003). The economic incentive strategy includes analysing the costs and benefits of best management practices that will lead to improved water quality (The State of Queensland and Commonwealth of Australia, 2003).

Relevance: 60.00%

Abstract:

Hereditary nonpolyposis colorectal cancer (HNPCC) and familial adenomatous polyposis (FAP) are characterized by a high risk and early onset of colorectal cancer (CRC). HNPCC is due to a germline mutation in one of the mismatch repair (MMR) genes MLH1, MSH2, MSH6 and PMS2. A majority of FAP and attenuated FAP (AFAP) cases are due to germline mutations of APC, causing the development of multiple colorectal polyps. To date, over 450 MMR gene mutations and over 800 APC mutations have been identified. Most of these mutations lead to a truncated protein, easily detected by conventional mutation detection methods. However, in about 30% of HNPCC and FAP families, and about 90% of AFAP families, the mutations remain unknown. We aimed to clarify the genetic basis and genotype-phenotype correlation of mutation-negative HNPCC and FAP/AFAP families by advanced mutation detection methods designed to detect large genomic rearrangements, mRNA and protein expression alterations, promoter mutations, phenotype-linked haplotypes, and tumoral loss of heterozygosity. We also aimed to estimate the frequency of HNPCC in Uruguayan CRC patients. Our expression-based analysis of mutation-negative HNPCC divided these families into two categories: 1) 42% of families linked to the MMR genes, with a phenotype resembling that of mutation-positive families, and 2) 58% of families likely to be associated with other susceptibility genes. Unbalanced mRNA expression of MLH1 was observed in two families. Further studies revealed that an MLH1 nonsense mutation, R100X, was associated with aberrant splicing of exons not related to the mutation, and that an MLH1 deletion (AGAA) at nucleotide 210 was associated with multiple exon skipping, without an overall increase in the frequency of splice events. APC mutation-negative FAP/AFAP families were divided into four groups according to the genetic basis of their predisposition. Four (14%) families displayed a constitutional deletion of APC with profuse polyposis, early age of onset and frequent extracolonic manifestations. Aberrant mRNA expression of one allele was observed in seven (24%) families with later onset and less frequent extracolonic manifestations. In 15 (52%) families the involvement of APC could neither be confirmed nor excluded. In three (10%) families a germline mutation was detected in genes other than APC: AXIN2 in one family and MYH in two families. The families with an undefined genetic basis, and especially those with AXIN2 or MYH mutations, frequently displayed AFAP or atypical polyposis. Of the Uruguayan CRC patients, 2.6% (12/461) fulfilled the diagnostic criteria for HNPCC and 5.6% (26/461) were associated with an increased risk of cancer. The unexpectedly low frequency of molecularly defined HNPCC cases may suggest a different genetic profile in the Uruguayan population and the involvement of novel susceptibility genes. Accurate genetic and clinical characterization of families with hereditary colorectal cancer, and the definition of the genetic basis of "mutation negative" families in particular, facilitate the proper clinical management of such families.

Relevance: 60.00%

Abstract:

Converting from an existing irrigation system is often seen as high risk by the landowner. The significant financial investment and the long period over which the investment runs are further complicated by the uncertainty associated with long-term input costs (such as energy), crop production, and continually evolving natural resource management rules and policy. Irrigation plays a pivotal part in the Burdekin sugarcane farming system. At present, furrow irrigation is by far the most common form owing to its ease of use, relatively low operating cost and the well-established infrastructure currently in place. The Mulgrave Area Farmer Integrated Action (MAFIA) grower group, located near Clare in the lower Burdekin region, identified the need to learn about sustainable farming systems with a focus on environmental, social and economic implications. In early 2007, Hesp Farming established a site to investigate the use of overhead irrigation as an alternative to furrow irrigation and its integration with new farming system practices, including Green Cane Trash Blanketing (GCTB). Although significant environmental and social benefits exist, the preliminary investment analysis indicates that the Overhead Low Pressure (OHLP) irrigation system is not adding financial value to the Hesp Farming business. A combination of high capital costs and other offsetting factors resulted in the benefits not being fully realised. A different outcome is achieved if Hesp Farming is able to realise value on the water saved, with both OHLP irrigation systems displaying a positive NPV. This case study provides a framework to further investigate the economics of OHLP irrigation in sugarcane, and it is anticipated that with additional data a more definitive outcome will be developed in the future.
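The investment comparison above rests on a net present value (NPV) calculation over the life of the system. A minimal sketch of that calculation, with assumed capital cost, annual net benefit, water saving and discount rate rather than the case study's actual figures:

```python
def npv(rate: float, cashflows: list[float]) -> float:
    """Net present value; cashflows[0] occurs at time zero (the capital outlay)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Illustrative only: converting furrow to overhead low-pressure (OHLP) irrigation.
capital_cost = -350_000        # assumed up-front cost of the OHLP system
annual_net_benefit = 18_000    # assumed labour/yield benefit net of energy costs
water_saved_ml = 150           # assumed annual water saving (ML)
water_value_per_ml = 150       # assumed value if the saved water can be used or traded
years = 15
rate = 0.07

without_water_value = [capital_cost] + [annual_net_benefit] * years
with_water_value = [capital_cost] + [annual_net_benefit
                                     + water_saved_ml * water_value_per_ml] * years

print(f"NPV, no value on saved water: ${npv(rate, without_water_value):,.0f}")   # negative
print(f"NPV, saved water valued:      ${npv(rate, with_water_value):,.0f}")      # positive
```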

Relevance: 60.00%

Abstract:

Understanding the host range of all fruit fly species within the South Pacific region is vital to establishing trade and quarantine protocols. This is important for the countries within the region and their trading partners. A significant aspect of the Australian Centre for International Agricultural Research (ACIAR) and Regional Fruit Fly Projects (RFFP) has been host fruit collecting, which has provided information on fruit fly host records in the seven participating countries. This work is continuing in all project countries at different intensities. In the Cook Islands, Fiji, Tonga and Western Samoa, fruit surveys have assumed a quarantine surveillance role, with a focus on high-risk fruits such as guava, mango, citrus, bananas, cucurbits and solanaceous fruits. In the Solomon Islands, Vanuatu and the Federated States of Micronesia (FSM), fruit surveys are still at a stage where host ranges are far from complete; by the end of the current project a more complete picture of the fruit fly hosts in these countries will have been gained. A brief summary of the data collected to date is as follows: 23,947 fruit samples collected; 2,181 positive host fruit records; 31 fruit fly species reared from fruit; 12 species reared from commercial fruit. A commercial fruit is classed as an edible fruit with potential for trade at either a local or an international level; this allows for the inclusion of endemic fruit species that have cultural significance as a food source. On the basis of these results, there are fruit fly species of major economic importance in the South Pacific region, but considerably more fruit survey work is required to establish a detailed understanding of all the pest species.

Relevance: 60.00%

Abstract:

BACKGROUND: The ATM gene, encoding a putative protein kinase, is mutated in ataxia-telangiectasia (A-T), an autosomal recessive disorder with a predisposition to cancer. Studies of A-T families suggest that female heterozygotes have an increased risk of breast cancer compared with noncarriers. However, neither linkage analyses nor mutation studies have provided supporting evidence for a role of ATM in breast cancer predisposition. Nevertheless, two recurrent ATM mutations, T7271G and IVS10-6T-->G, reportedly increase the risk of breast cancer. We examined these two ATM mutations in a population-based case-control series and in multiple-case breast cancer families. METHODS: 525 case patients with breast cancer and 381 control subjects were genotyped for the T7271G mutation, and 262 case patients and 68 control subjects for the IVS10-6T-->G mutation, as were index patients from 76 non-BRCA1/2 multiple-case breast cancer families. Linkage and penetrance were analyzed. ATM protein expression and kinase activity were analyzed in lymphoblastoid cell lines from mutation carriers. All statistical tests were two-sided. RESULTS: In case and control subjects unselected for family history of breast cancer, one case patient had the T7271G mutation and none had the IVS10-6T-->G mutation. In three multiple-case families, one of these two mutations segregated with breast cancer. The estimated average penetrance of the mutations was 60% (95% confidence interval [CI] = 32% to 90%) to age 70 years, equivalent to a 15.7-fold (95% CI = 6.4-fold to 38.0-fold) increased relative risk compared with that of the general population. Expression and activity analyses of ATM in heterozygous cell lines indicated that both mutations are dominant negative. CONCLUSION: At least two ATM mutations are associated with a sufficiently high risk of breast cancer to be found in multiple-case breast cancer families. Full mutation analysis of the ATM gene in such families could help clarify the role of ATM in breast cancer susceptibility.
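The 15.7-fold figure is what one gets when the carriers' estimated cumulative risk (penetrance) is set against the cumulative risk in the general population; the study derived it with formal penetrance modelling, but the rough arithmetic is easy to show. The population risk used below (about 3.8% to age 70) is an assumption for illustration and is not stated in the abstract:

```python
# Penetrance to age 70 estimated for the two ATM mutations (from the abstract)
carrier_cumulative_risk = 0.60

# Assumed general-population cumulative breast cancer risk to age 70;
# this value is illustrative and not taken from the paper.
population_cumulative_risk = 0.038

relative_risk = carrier_cumulative_risk / population_cumulative_risk
print(f"approximate relative risk: {relative_risk:.1f}-fold")  # ~15.8-fold
```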

Relevance: 60.00%

Abstract:

Objectives: We sought to characterise the demographics, length of admission, final diagnoses, long-term outcomes and costs associated with the population presenting to an Australian emergency department (ED) with symptoms of possible acute coronary syndrome (ACS). Design, setting and participants: Prospectively collected data on ED patients presenting with suspected ACS between November 2008 and February 2011 were used, including data on presentation and at 30 days after presentation. Information on patient disposition, length of stay and costs incurred was extracted from hospital administration records. Main outcome measures: Primary outcomes were mean and median cost and length of hospital stay. Secondary outcomes were diagnosis of ACS, other cardiovascular conditions or non-cardiovascular conditions within 30 days of presentation. Results: An ACS was diagnosed in 103 (11.1%) of the 926 patients recruited. 193 patients (20.8%) were diagnosed with other cardiovascular-related conditions and 622 patients (67.2%) had non-cardiac-related chest pain. ACS events occurred in none of the low-risk group and in 11 (1.9%) of the intermediate-risk group. Ninety-two (28.0%) of the 329 high-risk patients had an ACS event. Patients with a proven ACS, high-grade atrioventricular block, pulmonary embolism and other respiratory conditions had the longest lengths of stay. The mean cost was highest in the ACS group ($13 509; 95% CI, $11 794–$15 223), followed by other cardiovascular conditions ($7283; 95% CI, $6152–$8415) and non-cardiovascular conditions ($3331; 95% CI, $2976–$3685). Conclusions: Most ED patients with symptoms of possible ACS do not have a cardiac cause for their presentation. The current guideline-based process of assessment is lengthy, costly and consumes significant resources. Investigation of strategies to shorten this process, or to reduce the need for objective cardiac testing in patients at intermediate risk according to the National Heart Foundation and Cardiac Society of Australia and New Zealand guideline, is required.
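The cost results above are reported as group means with 95% confidence intervals. A small sketch of that summary for a vector of per-patient costs, using randomly generated illustrative data rather than the study's records:

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative right-skewed per-patient costs for an 'ACS' group; not study data.
costs = rng.lognormal(mean=9.4, sigma=0.5, size=103)

mean = costs.mean()
sem = costs.std(ddof=1) / np.sqrt(len(costs))           # standard error of the mean
ci_low, ci_high = mean - 1.96 * sem, mean + 1.96 * sem  # normal-approximation 95% CI

print(f"mean cost ${mean:,.0f} (95% CI ${ci_low:,.0f}-${ci_high:,.0f}), "
      f"median ${np.median(costs):,.0f}")
```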

Relevance: 60.00%

Abstract:

Multiple Trichinella species are reported from the Australasian region, although mainland Australia has never confirmed an indigenous case of Trichinella infection in humans or animals. Wildlife surveys in high-risk regions are essential to truly determine the presence or absence of Trichinella, but are largely lacking in mainland Australia. In this study, wild pigs from mainland Australia's Cape York Peninsula and the Torres Strait region were surveyed for the presence of Trichinella, given the proximity of a Trichinella papuae reservoir in nearby Papua New Guinea (PNG). We report the detection of a Trichinella infection in a pig from an Australian island in the Torres Strait, the narrow waterway that separates the island of New Guinea from continental Australia. The larvae were characterised as T. papuae (Kikori strain) by PCR and sequence analysis. No Trichinella parasites were found in any pigs from the Cape York Peninsula. These results highlight the role the Torres Strait may play in providing a passage for the introduction of Trichinella parasites from the Australasian region to the Australian mainland.

Relevance: 60.00%

Abstract:

Data from 9296 calves born to 2078 dams over 9 years across five sites were used to investigate factors associated with calf mortality in tropically adapted breeds (Brahman and Tropical Composite) recorded in extensive production systems, using multivariate logistic regression. Average pre-weaning calf mortality was 9.5% of calves born, varying from 1.5% to 41% across sites and years. In total, 67% of calves that died did so within a week of birth, with the cause of death most frequently recorded as unknown. The major factors significantly (P < 0.05) associated with mortality for potentially large numbers of calves included the specific production environment represented by site-year, low calf birthweight (more so than high birthweight) and horn status at branding. Almost all calf deaths post-branding (assessed from n = 8348 calves) occurred in calves that were dehorned, totalling 2.1% of dehorned calves and 15.9% of all calf deaths recorded. Breed effects on calf mortality were primarily the result of breed differences in calf birthweight and, to a lesser extent, the large teat size of cows; however, differences in other breed characteristics could be important. Twin births and calves assisted at birth had a very high risk of mortality, but <1% of calves were twins and few calves were assisted at birth; conversely, it could not be established how many calves would have benefited from assistance at birth. Cow age group and the outcome from the previous season were also associated with current calf mortality; maiden or young cows (<4 years old) had increased calf losses overall. More mature cows with a previous outcome of calf loss were also more likely to have another calf loss in the subsequent year, and this should be considered in culling decisions. Closer attention to the management of younger cows is warranted to improve calf survival.
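The risk-factor analysis described above is a multivariate logistic regression on a binary pre-weaning survival outcome. A minimal sketch of that kind of model in statsmodels; the variable names and the synthetic records are placeholders, not the study's dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2_000

# Synthetic stand-in for the calf records; variable names are illustrative only.
df = pd.DataFrame({
    "died_preweaning": rng.binomial(1, 0.095, n),
    "birthweight_kg": rng.normal(31, 5, n),
    "dam_age_group": rng.choice(["maiden", "young", "mature"], n),
    "breed": rng.choice(["Brahman", "TropicalComposite"], n),
    "site_year": rng.choice([f"site{s}_{yr}" for s in range(1, 6)
                             for yr in (2004, 2005)], n),
})

# Quadratic birthweight term allows risk to rise at both low and high birthweights
model = smf.logit(
    "died_preweaning ~ birthweight_kg + I(birthweight_kg**2) "
    "+ C(dam_age_group) + C(breed) + C(site_year)",
    data=df,
).fit(disp=False)

# Odds ratios for each term
odds_ratios = np.exp(model.params)
print(odds_ratios.round(2))
```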

Relevance: 60.00%

Abstract:

Trichinella surveillance in wildlife relies on muscle digestion of large samples, which are logistically difficult to store and transport in remote and tropical regions as well as labour-intensive to process. Serological methods such as enzyme-linked immunosorbent assays (ELISAs) offer rapid, cost-effective alternatives for surveillance but should be paired with additional tests because of the high false-positive rates encountered in wildlife. We investigated the utility of ELISAs coupled with Western blot (WB) in providing evidence of Trichinella exposure or infection in wild boar. Serum samples were collected from 673 wild boar from a high-risk and a low-risk region for Trichinella introduction within mainland Australia, which is considered Trichinella-free. Sera were examined using both an 'in-house' and a commercially available indirect ELISA, each using excretory-secretory (E/S) antigens. Cut-off values for positive results were determined using sera from the low-risk population. All 352 wild boar from the high-risk region and 139/321 (43.3%) of the wild boar from the low-risk region were tested by artificial digestion. Western blot testing using E/S antigens and a Trichinella-specific real-time PCR were also carried out on all ELISA-positive samples. The two ELISAs correctly classified all positive controls as well as one naturally infected wild boar from Gabba Island in the Torres Strait. In both the high- and low-risk populations, the ELISA results showed substantial agreement (kappa = 0.66), which increased to very good agreement (kappa = 0.82) when only WB-positive samples were compared. Testing of sera collected from the Australian mainland showed a Trichinella seroprevalence of 3.5% (95% C.I. 0.0-8.0) and 2.3% (95% C.I. 0.0-5.6) using the in-house and the commercial ELISA coupled with WB, respectively. These estimates were significantly higher (P < 0.05) than the artificial digestion estimate of 0.0% (95% C.I. 0.0-1.1). Real-time PCR testing of muscle from seropositive animals did not detect Trichinella DNA in any mainland animals, but did reveal the presence of a second larvae-positive wild boar on Gabba Island, supporting its utility as an alternative, highly sensitive method for muscle examination. The serology results suggest Australian wildlife may have been exposed to Trichinella parasites. However, because of the possibility of non-specific reactions with other parasitic infections, more work using well-defined cohorts of positive and negative samples is required. Even if the specificity of the ELISAs proves to be low, their ability to correctly classify the small number of true-positive sera in this study indicates their utility in screening wild boar populations for reactive sera, which can be followed up with additional testing.
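Two of the quantities reported above, agreement between the two ELISAs (kappa) and seroprevalence with a confidence interval, can be sketched directly. The counts below are placeholders rather than the paper's actual tables:

```python
import numpy as np

def cohens_kappa(table: np.ndarray) -> float:
    """Cohen's kappa for a 2x2 agreement table (rows = test A, cols = test B)."""
    n = table.sum()
    observed = np.trace(table) / n
    expected = (table.sum(axis=1) * table.sum(axis=0)).sum() / n**2
    return (observed - expected) / (1 - expected)

def wilson_ci(positives: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score 95% confidence interval for a proportion."""
    p = positives / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * np.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Hypothetical agreement table between the in-house and commercial ELISA
agreement = np.array([[12, 4],
                      [3, 302]])
print(f"kappa = {cohens_kappa(agreement):.2f}")

# Hypothetical seroprevalence (ELISA-positive confirmed by WB) with a Wilson CI
lo, hi = wilson_ci(positives=11, n=321)
print(f"seroprevalence = {11/321:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```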

Relevance: 60.00%

Abstract:

A recent report to the Australian Government identified concerns about Australia's capacity to respond to a medium to large outbreak of foot-and-mouth disease (FMD). To assess the resources required, the AusSpread disease simulation model was used to develop a plausible outbreak scenario that included 62 infected premises in five different states at the time of detection, 28 days after the disease entered the first property in Victoria. Movements of infected animals and/or contaminated product and equipment led to smaller outbreaks in NSW, Queensland, South Australia and Tasmania. With unlimited staff resources, the outbreak was eradicated in 63 days with 54 infected premises and a 98% chance of eradication within 3 months. This unconstrained response was estimated to involve 2724 personnel. Unlimited personnel was considered unrealistic, so the course of the outbreak was modelled using three levels of staffing and the probability of achieving eradication within 3 or 6 months of introduction was determined. Under the baseline staffing level, there was only a 16% probability that the outbreak would be eradicated within 3 months, and a 60% probability of eradication within 6 months. Deployment of an additional 60 personnel in the first 3 weeks of the response increased the likelihood of eradication within 3 months to 68%, and within 6 months to 100%. Deployment of further personnel incrementally increased the likelihood of timely eradication and decreased the duration and size of the outbreak. Targeted use of vaccination in high-risk areas, coupled with the baseline personnel resources, increased the probability of eradication within 3 months to 74% and within 6 months to 100%. This required 25 vaccination teams commencing 12 days into the control program, increasing to 50 vaccination teams 3 weeks later. Deploying an equal number of additional personnel to surveillance and infected-premises operations was equally effective in reducing the outbreak size and duration.
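Probabilities such as a 68% chance of eradication within 3 months come from running the stochastic outbreak simulation many times and counting the runs that achieve eradication by the deadline. The post-processing step looks roughly like the sketch below; the durations are randomly generated stand-ins, not AusSpread output:

```python
import numpy as np

rng = np.random.default_rng(7)

# Pretend outbreak durations (days from incursion to eradication) from many
# stochastic simulation runs; synthetic values, not AusSpread output.
durations_baseline = rng.gamma(shape=6.0, scale=25.0, size=1_000)
durations_extra_staff = rng.gamma(shape=6.0, scale=14.0, size=1_000)

def prob_eradicated_by(durations: np.ndarray, deadline_days: int) -> float:
    """Fraction of runs in which eradication was achieved by the deadline."""
    return float((durations <= deadline_days).mean())

for label, runs in [("baseline staffing", durations_baseline),
                    ("extra 60 personnel", durations_extra_staff)]:
    p90 = prob_eradicated_by(runs, 90)
    p180 = prob_eradicated_by(runs, 180)
    print(f"{label}: P(<=3 months) = {p90:.0%}, P(<=6 months) = {p180:.0%}")
```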

Relevance: 60.00%

Abstract:

- Objective: This study examined chronic disease risks and the use of a smartphone activity-tracking application during an intervention in Australian truck drivers (April-October 2014).
- Methods: Forty-four men (mean age 47.5 [SD 9.8] years) completed baseline health measures and were subsequently offered access to a free wrist-worn activity tracker and smartphone application (Jawbone UP) to monitor step counts and dietary choices during a 20-week intervention. Chronic disease risks were evaluated against guidelines; weekly step count and dietary logs registered by drivers in the application were analysed to evaluate use of the Jawbone UP.
- Results: Chronic disease risks were high (e.g. 97% had a high waist circumference [≥94 cm]). Eighteen drivers (41%) did not start the intervention; smartphone technical barriers were the main reason for dropout. Across the 20 weeks, drivers who used the Jawbone UP logged step counts for an average of 6 [SD 1] days/week; mean step counts remained consistent across the intervention (weeks 1–4: 8,743 [SD 2,867] steps/day; weeks 17–20: 8,994 [SD 3,478] steps/day). The median number of dietary logs decreased significantly from the start (17 [IQR 38] logs/week) to the end of the intervention (0 [IQR 23] logs/week; p<0.01); the median proportion of healthy diet choices relative to total diet choices logged increased across the intervention (weeks 1–4: 38 [IQR 21]%; weeks 17–20: 58 [IQR 18]%).
- Conclusions: Step counts were more successfully monitored than dietary choices by the drivers who used the Jawbone UP.
- Implications: Smartphone technology facilitated active living and healthy dietary choices, but also hindered intervention engagement for a number of these high-risk Australian truck drivers.
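The application-log analysis above amounts to summarising step counts over 4-week blocks and comparing dietary-log counts at the start and end of the intervention. The abstract's p < 0.01 suggests a paired comparison, though the exact test is not named; the sketch below uses a Wilcoxon signed-rank test on synthetic logs, not the trial data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_drivers = 26  # assumed number of drivers who actually used the tracker; illustrative

# Synthetic per-driver mean daily steps in weeks 1-4 and 17-20
steps_wk1_4 = rng.normal(8_700, 2_900, n_drivers)
steps_wk17_20 = rng.normal(9_000, 3_500, n_drivers)

# Synthetic per-driver dietary logs per week at the start and end
logs_start = rng.poisson(17, n_drivers)
logs_end = rng.poisson(4, n_drivers)

print(f"steps weeks 1-4:   {steps_wk1_4.mean():,.0f} (SD {steps_wk1_4.std(ddof=1):,.0f})")
print(f"steps weeks 17-20: {steps_wk17_20.mean():,.0f} (SD {steps_wk17_20.std(ddof=1):,.0f})")

# Paired nonparametric comparison of dietary logging at start vs end
stat, p = stats.wilcoxon(logs_start, logs_end)
print(f"dietary logs: median {np.median(logs_start):.0f} -> {np.median(logs_end):.0f} "
      f"per week, Wilcoxon p = {p:.3g}")
```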

Relevance: 60.00%

Abstract:

Grasses, legumes, saltbushes and herbs were evaluated at six sites in southern inland Queensland to identify potential pasture and forage plants for use on marginal cropping soils. The region experiences summer heat waves and severe winter frosts. Emphasis was on perennial plants, and native species were included. Seedlings were transplanted into unfertilized fields in either summer or autumn to suit the growing season of the plants, and watered to ensure establishment. Summer-growing grasses were the most successful group, while cool-season-growing perennials mostly failed. Summer legumes were disappointing, with Stylosanthes scabra and Indigofera schimperi performing best. Some lines, such as I. schimperi and the Eragrostis hybrid cv. Cochise, were assessed as potential weeds owing to low animal acceptance. Native Rhynchosia minima grew well at some sites and deserves more study. Cenchrus ciliaris was always easy to establish and produced the highest yields. Persistence of some Digitaria and Bothriochloa species, Eragrostis curvula and Fingerhuthia africana at specific sites was encouraging, but potential weediness needs careful assessment. Standard species, such as Austrostipa scabra for cool-season-growing grasses, were identified to represent the main forage types for incorporation into future trials with new genetic materials. The early field-testing protocol used here should be considered for use elsewhere if unreliable rainfall poses a high risk of establishment failure from scarce seed.

Relevance: 60.00%

Abstract:

Lethal control of wild dogs - that is, dingoes (Canis lupus dingo) and dingo/dog (Canis lupus familiaris) hybrids - to reduce livestock predation in Australian rangelands is claimed to cause continental-scale impacts on biodiversity. Although top-predator populations may recover numerically after baiting, they are predicted to be functionally different and incapable of fulfilling critical ecological roles. This study reports the impact of baiting programmes on wild dog abundance, age structures and the prey of wild dogs during large-scale manipulative experiments. Wild dog relative abundance almost always decreased after baiting, but reductions were variable and short-lived unless the prior baiting programme was particularly effective or there were follow-up baiting programmes within a few months. However, the age structures of wild dogs in baited and nil-treatment areas were demonstrably different, and prey populations did diverge relative to nil-treatment areas. Re-analysed observations of wild dogs preying on kangaroos from a separate study show that successful chases resulting in attacks on kangaroos occurred when mean wild dog age was higher and mean group size was larger. It is likely that the impact of lethal control on wild dog numbers, group sizes and age structures compromises their ability to handle large, difficult-to-catch prey. Under certain circumstances, these changes sometimes lead to increased calf loss (Bos indicus/B. taurus genotypes) and increased kangaroo numbers. Rangeland beef producers could consider controlling wild dogs in high-risk periods, when predation is more likely, and avoid baiting at other times.