1000 results for Assessment roll


Relevance: 20.00%

Abstract:

Purpose To determine the association between conjunctival goblet cell density (GCD) assessed using in vivo laser scanning confocal microscopy and conjunctival impression cytology in a healthy population. Methods Ninety (90) healthy participants completed a validated 5-item dry eye questionnaire, non-invasive tear film break-up time measurement, ocular surface fluorescein staining and the phenol red thread test. These tests were undertaken to diagnose and exclude participants with dry eye. The nasal bulbar conjunctiva was imaged using laser scanning confocal microscopy (LSCM). Conjunctival impression cytology (CIC) was performed in the same region a few minutes later. Conjunctival GCD was calculated as cells/mm². Results There was a strong positive correlation of conjunctival GCD between LSCM and CIC (ρ = 0.66). Conjunctival GCD was 475 ± 41 cells/mm² and 466 ± 51 cells/mm² measured by LSCM and CIC, respectively. Conclusions The strong association between in vivo and in vitro cellular analysis for measuring conjunctival GCD suggests that the more invasive CIC can be replaced by the less invasive LSCM in research and clinical practice.
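A minimal sketch (not the authors' code) of the kind of method-agreement analysis reported above: a Spearman rank correlation between paired GCD measurements from the two techniques. The arrays below are illustrative placeholders, not study data.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
gcd_lscm = rng.normal(475, 41, size=90)            # cells/mm^2, in vivo confocal (placeholder)
gcd_cic = gcd_lscm + rng.normal(0, 40, size=90)    # cells/mm^2, impression cytology (placeholder)

# Rank correlation between the paired measurements from the two methods
rho, p = spearmanr(gcd_lscm, gcd_cic)
print(f"Spearman rho = {rho:.2f} (p = {p:.3g})")
print(f"LSCM: {gcd_lscm.mean():.0f} +/- {gcd_lscm.std(ddof=1):.0f} cells/mm^2")
print(f"CIC:  {gcd_cic.mean():.0f} +/- {gcd_cic.std(ddof=1):.0f} cells/mm^2")
```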

Relevance: 20.00%

Abstract:

The future use of genetically modified (GM) plants in food, feed and biomass production requires careful consideration of possible risks related to the unintended spread of transgenes into new habitats. This may occur via introgression of the transgene into conventional genotypes, due to cross-pollination, and via the invasion of GM plants into new habitats. Assessment of the possible environmental impacts of GM plants requires estimation of the level of gene flow from a GM population. Furthermore, management measures for reducing gene flow from GM populations are needed in order to prevent possible unwanted effects of transgenes on ecosystems. This work develops modeling tools for estimating gene flow from GM plant populations in boreal environments and for investigating the mechanisms of the gene flow process. To describe the spatial dimensions of gene flow, dispersal models are developed for the local- and regional-scale spread of pollen grains and seeds, with special emphasis on wind dispersal. This study provides tools for describing cross-pollination between GM and conventional populations and for estimating the levels of transgenic contamination of conventional crops. For perennial populations, a modeling framework describing the dynamics of plants and genotypes is developed in order to estimate the gene flow process over a sequence of years. The dispersal of airborne pollen and seeds cannot be easily controlled, and small amounts of these particles are likely to disperse over long distances. Wind dispersal processes are highly stochastic due to variation in atmospheric conditions, so there may be considerable variation between individual dispersal patterns. This, in turn, is reflected in the large variation in annual levels of cross-pollination between GM and conventional populations. Even though land-use practices affect the average levels of cross-pollination between GM and conventional fields, the level of transgenic contamination of a conventional crop remains highly stochastic. The demographic effects of a transgene influence the establishment of transgenic plants amongst conventional genotypes of the same species. If the transgene gives a plant a considerable fitness advantage in comparison to conventional genotypes, the spread of transgenes to conventional populations can be strongly increased. In such cases, dominance of the transgene considerably increases gene flow from GM to conventional populations, due to the enhanced fitness of heterozygous hybrids. The fitness of GM plants in conventional populations can be reduced by linking the selectively favoured primary transgene to a disfavoured mitigation transgene. Recombination between these transgenes is a major risk related to this technique, especially because it tends to take place amongst the conventional genotypes and thus promotes the establishment of invasive transgenic plants in conventional populations.
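A minimal sketch, not the authors' model, of the stochastic wind-dispersal idea described above: Monte Carlo simulation of pollen displacement under variable wind, from which the tail probability of long-distance dispersal can be read off. The kernel choice and all parameter values are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                                            # pollen grains released at the origin
wind_speed = rng.gamma(shape=2.0, scale=1.5, size=n)   # m/s, varies per grain (assumed)
wind_dir = rng.normal(0.0, 0.3, size=n)                # radians; prevailing wind along x (assumed)
flight_time = rng.exponential(20.0, size=n)            # s until deposition (assumed)

dist = wind_speed * flight_time                        # downwind travel distance (m)
x = dist * np.cos(wind_dir)
y = dist * np.sin(wind_dir) + rng.normal(0, 5.0, size=n)  # extra crosswind spread

# Fraction of grains dispersing beyond a hypothetical 200 m isolation distance
print(f"P(distance > 200 m) = {(np.hypot(x, y) > 200).mean():.4f}")
```

Because the flight-time and wind-speed distributions are heavy-tailed, a small but non-zero fraction of grains always travels far beyond any practical isolation distance, which is the mechanism behind the stochastic contamination levels the abstract describes.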

Relevance: 20.00%

Abstract:

Objectives Decision support tools (DSTs) for invasive species management have had limited success in producing convincing results and meeting users' expectations. The problems could be linked to the functional form of the model that represents the dynamic relationship between the invasive species and crop yield loss in the DSTs. The objectives of this study were: a) to compile and review the models tested in field experiments and applied in DSTs; and b) to empirically evaluate some popular models and alternatives. Design and methods This study surveyed the literature and documented strengths and weaknesses of the functional forms of yield loss models. Some widely used models (linear, relative yield and hyperbolic models) and two potentially useful models (the double-scaled and density-scaled models) were evaluated for a wide range of weed densities, maximum potential yield loss and maximum yield loss per weed. Results Popular functional forms include hyperbolic, sigmoid, linear, quadratic and inverse models. Many basic models were modified to account for the effects of important factors (weather, tillage and growth stage of the crop at weed emergence) influencing weed–crop interaction and to improve prediction accuracy. This limited their applicability in DSTs, as they became less generalized and often applicable to a much narrower range of conditions than would be encountered in the use of DSTs. These factors' effects could be better accounted for using other techniques. Among the models empirically assessed, the linear model is very simple and appears to work well at sparse weed densities, but it produces unrealistic behaviour at high densities. The relative-yield model exhibits expected behaviour at high densities and high levels of maximum yield loss per weed but probably underestimates yield loss at low to intermediate densities. The hyperbolic model demonstrated reasonable behaviour at lower weed densities, but produced biologically unreasonable behaviour at low rates of loss per weed and high yield loss at the maximum weed density. The density-scaled model is not sensitive to the yield loss at maximum weed density in terms of the number of weeds that will produce a given proportion of that maximum yield loss. The double-scaled model appeared to produce more robust estimates of the impact of weeds under a wide range of conditions. Conclusions Previously tested functional forms exhibit problems for use in DSTs for crop yield loss modelling. Of the models evaluated, the double-scaled model exhibits desirable qualitative behaviour under most circumstances.
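A minimal sketch of two of the standard functional forms named above: the linear model and the rectangular hyperbola of Cousens (1985). Parameter values are illustrative assumptions, not the study's estimates; the output shows the qualitative point that the linear form can exceed 100% loss at high densities while the hyperbola saturates at its maximum.

```python
import numpy as np

def linear_loss(d, i):
    """Percent yield loss; i = loss per weed. Unbounded, so it can exceed 100%."""
    return i * d

def hyperbolic_loss(d, i, a):
    """Cousens (1985): i = loss per weed at low density, a = maximum % yield loss."""
    return i * d / (1.0 + i * d / a)

densities = np.array([1, 5, 25, 100, 400])   # weeds per m^2 (illustrative)
for d in densities:
    print(f"d={d:>4}: linear={linear_loss(d, 0.8):6.1f}%  "
          f"hyperbolic={hyperbolic_loss(d, 0.8, 60.0):5.1f}%")
```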

Relevance: 20.00%

Abstract:

Puccinia psidii, the causal agent of myrtle rust, was first recorded from Latin America more than 100 years ago. It occurs on many native species of Myrtaceae in Latin America and also infects non-native plantation-grown Eucalyptus species in the region. The pathogen has gradually spread to new areas including Australia and most recently South Africa. The aim of this study was to consider the susceptibility of selected Eucalyptus genotypes, particularly those of interest to South African forestry, to infection by P. psidii. In addition, risk maps were compiled based on suitable climatic conditions and the occurrence of potential susceptible tree species. This made it possible to identify the season when P. psidii would be most likely to infect and to define the geographic areas where the rust disease would be most likely to establish in South Africa. As expected, variation in susceptibility was observed between eucalypt genotypes tested. Importantly, species commonly planted in South Africa show good potential for yielding disease-tolerant material for future planting. Myrtle rust is predicted to be more common in spring and summer. Coastal areas, as well as areas in South Africa with subtropical climates, are more conducive to outbreaks of the pathogen.
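A minimal sketch, purely hypothetical, of the risk-mapping idea: flag grid cells as climatically suitable for rust outbreaks where temperature and humidity fall in a favourable range and a susceptible host occurs. The threshold rules and gridded inputs are assumptions for illustration, not the study's model.

```python
import numpy as np

rng = np.random.default_rng(1)
temp = rng.uniform(5, 30, size=(50, 50))             # mean monthly temperature, deg C (placeholder grid)
rel_humidity = rng.uniform(30, 100, size=(50, 50))   # relative humidity, % (placeholder grid)
host_present = rng.random((50, 50)) > 0.6            # occurrence of susceptible Myrtaceae (placeholder)

# Hypothetical suitability rule: mild temperatures, high humidity, host present
suitable = (temp >= 15) & (temp <= 25) & (rel_humidity >= 80) & host_present
print(f"Suitable cells: {suitable.sum()} of {suitable.size}")
```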

Relevance: 20.00%

Abstract:

In visual object detection and recognition, classifiers have two interesting characteristics: accuracy and speed. Accuracy depends on the complexity of the image features and classifier decision surfaces. Speed depends on the hardware and the computational effort required to use the features and decision surfaces. When attempts to increase accuracy lead to increases in complexity and effort, it is necessary to ask how much we are willing to pay for increased accuracy. For example, if increased computational effort implies quickly diminishing returns in accuracy, then those designing inexpensive surveillance applications cannot aim for maximum accuracy at any cost. It becomes necessary to find trade-offs between accuracy and effort. We study efficient classification of images depicting real-world objects and scenes. Classification is efficient when a classifier can be controlled so that the desired trade-off between accuracy and effort (speed) is achieved and unnecessary computations are avoided on a per-input basis. A framework is proposed for understanding and modeling efficient classification of images, in which classification is modeled as a tree-like process. In designing the framework, it is important to recognize what is essential and to avoid structures that are narrow in applicability; earlier frameworks are lacking in this regard. The overall contribution is twofold. First, the framework is presented, subjected to experiments, and shown to be satisfactory. Second, certain unconventional approaches are experimented with, which allows the separation of the essential from the conventional. To determine whether the framework is satisfactory, three categories of questions are identified: trade-off optimization, classifier tree organization, and rules for delegation and confidence modeling. Questions and problems related to each category are addressed and empirical results are presented. For example, related to trade-off optimization, we address the problem of computational bottlenecks that limit the range of trade-offs, and we ask whether accuracy-versus-effort trade-offs can be controlled after training. Regarding classifier tree organization, we first consider the task of organizing a tree in a problem-specific manner and then ask whether problem-specific organization is necessary.
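A minimal sketch, under assumed models and data, of the delegation idea described above: a cheap first-stage classifier handles confident inputs, and uncertain ones are delegated to a slower, more accurate second stage. The confidence threshold is the knob that controls the accuracy-versus-effort trade-off after training; all models and the dataset here are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=4000, n_features=30, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

cheap = LogisticRegression(max_iter=1000).fit(Xtr, ytr)          # fast first stage
costly = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)  # slow second stage

conf = cheap.predict_proba(Xte).max(axis=1)   # first-stage confidence per input
delegate = conf < 0.9                          # effort knob: lower threshold = less delegation
pred = cheap.predict(Xte)
pred[delegate] = costly.predict(Xte[delegate])  # only uncertain inputs pay the extra cost

print(f"Delegated {delegate.mean():.1%} of inputs; accuracy = {(pred == yte).mean():.3f}")
```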

Relevance: 20.00%

Abstract:

Four species of large mackerels (Scomberomorus spp.) co-occur in the waters off northern Australia and are important to fisheries in the region. State fisheries agencies monitor these species for fisheries assessment; however, data inaccuracies may arise from difficulties in identifying these closely related species, particularly when specimens are incomplete after fish processing. This study examined the efficacy of using otolith morphometrics to differentiate among, and predict, the four mackerel species off northeastern Australia. Seven otolith measurements and five shape indices were recorded from 555 mackerel specimens. Multivariate modelling, including linear discriminant analysis (LDA) and support vector machines, successfully differentiated among the four species based on otolith morphometrics. Cross-validation determined a predictive accuracy of at least 96% for both models. The optimum predictive model for the four mackerel species was an LDA model that included fork length, feret length, feret width, perimeter, area, roundness, form factor and rectangularity as explanatory variables. This analysis may improve the accuracy of fisheries monitoring, the estimates based on this monitoring (e.g. mortality rate) and the overall management of mackerel species in Australia.
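A minimal sketch (not the authors' pipeline) of species prediction from otolith morphometrics with linear discriminant analysis and cross-validation. The feature matrix and labels below are random placeholders standing in for the measured variables (fork length, feret dimensions, perimeter, area and shape indices) and the four species, so the printed accuracy is meaningless; with real data the same call reports the cross-validated accuracy quoted above.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(555, 8))        # placeholder for 8 morphometric variables
y = rng.integers(0, 4, size=555)     # placeholder for the four Scomberomorus species

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=10)   # 10-fold cross-validated accuracy
print(f"Mean CV accuracy: {scores.mean():.2f}")
```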

Relevance: 20.00%

Abstract:

The National Energy Efficient Building Project (NEEBP) Phase One report, published in December 2014, investigated “process issues and systemic failures” in the administration of the energy performance requirements in the National Construction Code. It found that most stakeholders believed that under-compliance with these requirements is widespread across Australia, with similar issues being reported in all states and territories. The report found that many different factors were contributing to this outcome and, as a result, offered many recommendations that together would be expected to remedy the systemic issues reported. To follow up on the Phase One report, three additional projects were commissioned as part of Phase Two of the overall NEEBP project. This report deals with the development and piloting of an Electronic Building Passport (EBP) tool – a project undertaken jointly by pitt&sherry and a team at the Queensland University of Technology (QUT) led by Dr Wendy Miller. The other Phase Two projects cover audits of Class 1 buildings and issues relating to building alterations and additions. The passport concept aims to provide all stakeholders with (controlled) access to the key documentation and information that they need to verify the energy performance of buildings. This trial project deals with residential buildings but in principle could apply to any building type. Nine councils were recruited to help develop and test a pilot electronic building passport tool. The participation of these councils – across all states – enabled an assessment of the extent to which councils currently utilise documentation to track the compliance of residential buildings with the energy performance requirements in the National Construction Code (NCC). Overall we found that none of the participating councils are currently compiling all of the energy performance-related documentation that would demonstrate code compliance. The key reasons for this include: a major lack of clarity on precisely what documentation should be collected; cost and budget pressures; low public/stakeholder demand for the documentation; and a pragmatic judgement that non-compliance with any regulated documentation requirements represents a relatively low risk for them. Some councils, for example, reported producing documentation such as certificates of final completion only on demand. Only three of the nine council participants reported regularly conducting compliance assessments or audits utilising this documentation and/or inspections. Overall we formed the view that documentation and information tracking processes operating within the building standards and compliance system are not working to assure compliance with the Code's energy performance requirements. In other words, the Code, and its implementation under state and territory regulatory processes, is falling short as a 'quality assurance' system for consumers. As a result it is likely that the new housing stock is under-performing relative to policy expectations, consuming unnecessary amounts of energy, imposing unnecessarily high energy bills on occupants, and generating unnecessary greenhouse gas emissions. At the same time, councils noted that the demand for documentation relating to building energy performance was low. All the participant councils in the EBP pilot agreed that documentation and information processes need to work more effectively if the potential regulatory and market drivers towards energy efficient homes are to be harnessed.
These findings are fully consistent with the Phase One NEEBP report. It was also agreed that an EBP system could potentially play an important role in improving documentation and information processes. However, only one of the participant councils indicated that it might adopt such a system on a voluntary basis. The majority felt that such a system would only be taken up if it were:
- a nationally agreed system, imposed as a mandatory requirement under state or national regulation;
- capable of being used by multiple parties, including councils, private certifiers, building regulators, builders and energy assessors in particular; and
- fully integrated into their existing document management systems, or at least seamlessly compatible rather than a separate, unlinked tool.
Further, we note that the value of an EBP in capturing statistical information relating to the energy performance of buildings would be much greater if it were adopted on a nationally consistent basis. Councils were clear that a key impediment to the take-up of an EBP system is that they face very considerable budget and staffing challenges. They report that they are often unable to meet all community demands from the resources available to them, and are therefore unlikely to provide resources to support the roll-out of an EBP system on a voluntary basis. Overall, we conclude from this pilot that the public good would be well served if the Australian, state and territory governments continued to develop and implement an Electronic Building Passport system in a cost-efficient and effective manner. This development should occur with detailed input from building regulators, the Australian Building Codes Board (ABCB), councils and private certifiers in the first instance. This report provides a suite of recommendations (Section 7.2) designed to advance the development and guide the implementation of a national EBP system.
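A minimal sketch, purely hypothetical, of what an Electronic Building Passport record might hold: the key energy performance documents for a dwelling, plus the roles granted (controlled) access. Field names and roles are illustrative assumptions, not a specification from the report.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class PassportDocument:
    title: str        # e.g. "Energy assessment certificate" (hypothetical)
    issued: date
    issuer: str       # assessor, certifier or council
    uri: str          # link into an existing document management system

@dataclass
class BuildingPassport:
    address: str
    building_class: str                    # NCC classification, e.g. "Class 1a"
    star_rating: float | None = None       # energy rating, if assessed
    documents: list[PassportDocument] = field(default_factory=list)
    readers: set[str] = field(default_factory=set)  # roles with controlled access

# Hypothetical usage: one dwelling, three parties with access
passport = BuildingPassport("1 Example St", "Class 1a", star_rating=6.0)
passport.readers.update({"council", "certifier", "owner"})
passport.documents.append(
    PassportDocument("Energy assessment certificate", date(2015, 3, 1),
                     "accredited assessor", "dms://records/12345"))
```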

Relevance: 20.00%

Abstract:

Despite being commonly prevalent in acute care hospitals worldwide, malnutrition often goes unidentified and untreated due to the lack of an implemented nutrition care pathway. The aim of this study was to validate nutrition screening and assessment tools in the Vietnamese language. After translation into Vietnamese, the Malnutrition Screening Tool (MST) and Subjective Global Assessment (SGA) were used to identify malnutrition in the adult setting, and the Paediatric Nutrition Screening Tool (PNST) and paediatric Subjective Global Nutritional Assessment (SGNA) were used in the paediatric setting, in two acute care hospitals in Vietnam. This cross-sectional observational study sampled 123 adults (median age 78 years [39–96 years], 63% male) and 105 children (median age 20 months [2–100 months], 66% male). In adults, nutrition risk and malnutrition were identified in 29% and 45% of the cohort, respectively. Nutrition risk and malnutrition were identified in 71% and 43% of the paediatric cohort, respectively. The sensitivity and specificity of the screening tools were 62% and 99% for the MST compared with the SGA, and 89% and 42% for the PNST compared with the SGNA. This study provides a stepping stone towards the use of evidence-based nutrition screening and assessment tools in the Vietnamese language within adult and paediatric acute care settings. Further work is required to integrate a complete nutrition care pathway within acute care settings in Vietnamese hospitals.
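A minimal sketch of the validation arithmetic behind figures like those above: sensitivity and specificity of a screening tool judged against a reference assessment. The paired boolean lists are illustrative placeholders, not the study's data.

```python
def sens_spec(screen, reference):
    """Sensitivity and specificity of a screening result against a reference diagnosis."""
    tp = sum(s and r for s, r in zip(screen, reference))          # true positives
    tn = sum(not s and not r for s, r in zip(screen, reference))  # true negatives
    fn = sum(not s and r for s, r in zip(screen, reference))      # missed cases
    fp = sum(s and not r for s, r in zip(screen, reference))      # false alarms
    return tp / (tp + fn), tn / (tn + fp)

screen = [True, True, False, True, False, False]     # e.g. MST "at risk" (hypothetical)
reference = [True, True, True, False, False, False]  # e.g. SGA "malnourished" (hypothetical)
sensitivity, specificity = sens_spec(screen, reference)
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
```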

Relevance: 20.00%

Abstract:

Background Skin temperature assessment is a promising modality for early detection of diabetic foot problems, but its diagnostic value has not been studied. Our aims were to investigate the diagnostic value of different cutoff skin temperature values for detecting diabetes-related foot complications such as ulceration, infection, and Charcot foot, and to determine urgency of treatment in cases of diagnosed infection or a red-hot swollen foot. Materials and Methods The plantar foot surfaces of 54 patients with diabetes visiting the outpatient foot clinic were imaged with an infrared camera. Nine patients had complications requiring immediate treatment, 25 patients had complications requiring non-immediate treatment, and 20 patients had no complications requiring treatment. Average pixel temperature was calculated for six predefined spots and for the whole foot. We calculated the area under the receiver operating characteristic curve for different cutoff skin temperature values using clinical assessment as the reference and determined the sensitivity and specificity for the optimal cutoff temperature value. Mean temperature difference between feet was analyzed using the Kruskal–Wallis test. Results The optimal cutoff skin temperature value for detection of diabetes-related foot complications was a 2.2°C difference between contralateral spots (sensitivity, 76%; specificity, 40%). The optimal cutoff skin temperature value for determining urgency of treatment was a 1.35°C difference between the mean temperatures of the left and right foot (sensitivity, 89%; specificity, 78%). Conclusions Detection of diabetes-related foot complications based on local skin temperature assessment is hindered by low diagnostic values. The mean temperature difference between the two feet may be an adequate marker for determining urgency of treatment.
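A minimal sketch, not the study's analysis, of choosing an optimal cutoff along a receiver operating characteristic curve by maximising Youden's J (sensitivity + specificity − 1). The temperature differences and labels below are illustrative values only.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
# Hypothetical left-right temperature differences: 34 feet with complications, 20 without
complication = np.r_[np.ones(34), np.zeros(20)].astype(int)
delta_t = np.r_[rng.normal(2.0, 1.0, 34), rng.normal(0.8, 0.8, 20)]   # deg C

fpr, tpr, thresholds = roc_curve(complication, delta_t)
best = np.argmax(tpr - fpr)   # index maximising Youden's J statistic
print(f"AUC = {roc_auc_score(complication, delta_t):.2f}")
print(f"cutoff = {thresholds[best]:.2f} C, "
      f"sensitivity = {tpr[best]:.0%}, specificity = {1 - fpr[best]:.0%}")
```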

Relevance: 20.00%

Abstract:

Children with developmental coordination disorder (DCD) face evident motor difficulties in activities of daily living (ADL). Assessment of their capacity in ADL is essential for diagnosis and intervention, in order to limit the daily consequences of the disorder. The aim of this study was to systematically review potential instruments for standardized and objective assessment of children's capacity in ADL, suited to children with DCD. As a first step, the MEDLINE, EMBASE, CINAHL and PsycINFO databases were searched to identify studies that described instruments with potential for assessment of capacity in ADL. Second, instruments were included for review when two independent reviewers agreed that the instruments: (1) are standardized and objective; (2) assess at the activity level and comprise items that reflect ADL; and (3) are applicable to school-aged children who can move independently. Out of 1507 publications, 66 were selected, describing 39 instruments. Seven of these instruments were found to fulfil the criteria and were included for review: the Bruininks-Oseretsky Test of Motor Proficiency-2 (BOT2); the Do-Eat; the Movement Assessment Battery for Children-2 (MABC2); the School Assessment of Motor and Process Skills (School AMPS); the Tufts Assessment of Motor Performance (TAMP); the Test of Gross Motor Development (TGMD); and the Functional Independence Measure for Children (WeeFIM). As a third step, the suitability of the included instruments for children with DCD was discussed based on the ADL covered, ecological validity and other psychometric properties. We concluded that current instruments do not provide the comprehensive and ecologically valid assessment of capacity in ADL required for children with DCD.

Relevance: 20.00%

Abstract:

Objective To develop the DCDDaily, an instrument for objective and standardized clinical assessment of capacity in activities of daily living (ADL) in children with developmental coordination disorder (DCD), and to investigate its usability, reliability, and validity. Subjects Five- to eight-year-old children with and without DCD. Main measures The DCDDaily was developed based on a thorough review of the literature and extensive expert involvement. To investigate the usability (assessment time and feasibility), reliability (internal consistency and repeatability), and validity (concurrent and discriminant validity) of the DCDDaily, children were assessed with the DCDDaily and the Movement Assessment Battery for Children-2 Test, and their parents completed the Movement Assessment Battery for Children-2 Checklist and the Developmental Coordination Disorder Questionnaire. Results 459 children were assessed (DCD group, n = 55; normative reference group, n = 404). Assessment was possible within 30 minutes and in any clinical setting. For internal consistency, Cronbach's α = 0.83. Intraclass correlations were 0.87 for test–retest reliability and 0.89 for inter-rater reliability. Concurrent correlations with the Movement Assessment Battery for Children-2 Test and the questionnaires were ρ = −0.494, 0.239, and −0.284 (p < 0.001). Discriminant validity measures showed significantly worse performance in the DCD group than in the control group (mean (SD) score 33 (5.6) versus 26 (4.3), p < 0.001). The area under the receiver operating characteristic curve was 0.872; sensitivity and specificity were both 80%. Conclusions The DCDDaily is a valid and reliable instrument for clinical assessment of capacity in ADL that is feasible for use in clinical practice.
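A minimal sketch of one psychometric statistic reported above: Cronbach's α for internal consistency, computed from a children-by-items score matrix as α = k/(k−1) · (1 − Σ item variances / total-score variance). The simulated score matrix is an illustrative placeholder, not DCDDaily data.

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: 2D array, rows = children, columns = test items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
ability = rng.normal(size=(459, 1))                       # latent ability per child
items = ability + rng.normal(scale=0.8, size=(459, 18))   # 18 correlated item scores
print(f"alpha = {cronbach_alpha(items):.2f}")
```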

Relevance: 20.00%

Abstract:

Background Patients with diabetic foot disease require frequent screening to prevent complications and may be helped through telemedical home monitoring. Within this context, the goal was to determine the validity and reliability of assessing diabetic foot infection using photographic foot imaging and infrared thermography. Subjects and Methods For 38 patients with diabetes who presented with a foot infection or were admitted to the hospital with a foot-related complication, photographs of the plantar foot surface using a photographic imaging device and temperature data from six plantar regions using an infrared thermometer were obtained. A temperature difference between feet of > 2.2°C defined a "hotspot." Two independent observers assessed each foot for presence of foot infection, both live (using the Perfusion-Extent-Depth-Infection-Sensation classification) and from photographs 2 and 4 weeks later (for presence of erythema and ulcers). Agreement in diagnosis between live assessment and (the combination of) photographic assessment and temperature recordings was calculated. Results Diagnosis of infection from photographs was specific (> 85%) but not very sensitive (< 60%). Diagnosis based on hotspots present was sensitive (> 90%) but not very specific (< 25%). Diagnosis based on the combination of photographic and temperature assessments was both sensitive (> 60%) and specific (> 79%). Intra-observer agreement between photographic assessments was good (Cohen's κ = 0.77 and 0.52 for the two observers). Conclusions Diagnosis of foot infection in patients with diabetes seems valid and reliable using photographic imaging in combination with infrared thermography. This supports the intended use of these modalities for the home monitoring of high-risk patients with diabetes to facilitate early diagnosis of signs of foot infection.
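A minimal sketch of the hotspot rule described above: flag a plantar region when the temperature difference between contralateral spots exceeds 2.2°C. Region names and readings are illustrative placeholders.

```python
REGIONS = ["hallux", "met1", "met3", "met5", "midfoot", "heel"]

# Hypothetical infrared readings (deg C) for the six plantar regions of each foot
left = {"hallux": 30.1, "met1": 31.5, "met3": 30.8, "met5": 30.2,
        "midfoot": 29.9, "heel": 30.4}
right = {"hallux": 33.0, "met1": 31.7, "met3": 30.6, "met5": 30.1,
         "midfoot": 30.2, "heel": 30.5}

# A region is a hotspot when the contralateral difference exceeds 2.2 deg C
hotspots = [r for r in REGIONS if abs(left[r] - right[r]) > 2.2]
print("hotspots:", hotspots or "none")
```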

Relevance: 20.00%

Abstract:

To develop and test a custom-built instrument to simultaneously assess tear film surface quality (TFSQ) and subjective vision score (SVS).