941 results for "Project method"
Abstract:
An abundant literature has demonstrated the benefits of empathy for intergroup relations (e.g., Batson, Chang, Orr, & Rowland, 2002). In addition, empathy has been identified as the mechanism by which various successful prejudice-reduction procedures impact attitudes and behaviour (e.g., Costello & Hodson, 2010). However, standard explicit techniques used in empathy-prejudice research have a number of potential limitations (e.g., resistance; McGregor, 1993). The present project explored an alternative technique: subliminally priming (i.e., outside of awareness) empathy-relevant terms (Study 1) or empathy itself (Study 2). Study 1 compared the effects of exposure to subliminal empathy-relevant primes (e.g., compassion) versus no priming and priming the opposite of empathy (e.g., indifference) on prejudice (i.e., negative attitudes), discrimination (i.e., resource allocation), and helping behaviour (i.e., willingness to empower, directly assist, or expect group change) towards immigrants. Relative to priming the opposite of empathy, participants exposed to primes of empathy-relevant constructs expressed less prejudice and were more willing to empower immigrants. In addition, the effects were not moderated by individual differences in prejudice-relevant variables (i.e., Disgust Sensitivity, Intergroup Disgust-Sensitivity, Intergroup Anxiety, Social Dominance Orientation, Right-wing Authoritarianism). Study 2 considered a different target category (i.e., Blacks) and attempted to strengthen the effects found by comparing the impact of subliminal empathy primes (relative to no prime or subliminal primes of empathy paired with Blacks) on explicit prejudice towards marginalized groups and Blacks, willingness to help marginalized groups and Blacks, as well as implicit prejudice towards Blacks.
In addition, Study 2 considered potential mechanisms for the predicted effects; specifically, general empathy, affective empathy towards Blacks, cognitive empathy towards Blacks, positive mood, and negative mood. Unfortunately, the subliminal empathy primes "backfired": exposure to subliminal empathy primes (relative to no prime) heightened prejudice towards marginalized groups and Blacks, and led to stronger expectations that marginalized groups and Blacks improve their own situation. However, exposure to subliminal primes pairing empathy with Blacks (relative to subliminal empathy primes alone) resulted in less prejudice towards marginalized groups and more willingness to directly assist Blacks, as expected. Interestingly, exposure to subliminal primes of empathy paired with Blacks (vs. empathy alone) resulted in more pro-White bias on the implicit prejudice measure. Study 2 did not find that the measured potential mediators explained these effects. Overall, the results of the present project do not provide strong support for the use of subliminal empathy primes for improving intergroup relations. In fact, the results of Study 2 suggest that the use of subliminal empathy primes may even backfire. The implications for intergroup research on empathy and for priming procedures generally are discussed.
Abstract:
Asymmetric cell division (ACD) is a division during which cell-fate determinants are distributed preferentially to one of the two daughter cells. Through the action of these determinants, ACD therefore generates two different daughter cells. ACD is thus important for generating cellular diversity and for maintaining the homeostasis of certain stem cells. To induce an asymmetric distribution of cell-fate determinants, the positioning of the mitotic spindle must be very tightly controlled. This frequently produces two daughter cells of different sizes, because the mitotic spindle is off-centre during mitosis, which induces asymmetric positioning of the cleavage furrow. Although a complex involving heterotrimeric GTPases and proteins linking the microtubules to the cortex has been directly implicated in positioning the mitotic spindle, the exact mechanism driving asymmetric spindle positioning during ACD is not yet understood. Recent studies suggest that asymmetric regulation of the actin cytoskeleton could be responsible for this asymmetric positioning of the mitotic spindle. We therefore hypothesize that asymmetric actin contractions during cell division could displace the mitotic spindle and the cleavage furrow to create cellular asymmetry. Our preliminary results showed that cortical blebbing, an indicator of cortical tension and contraction, occurs preferentially in the anterior half of sensory organ precursor (SOP) cells during telophase. Our data support the idea that small GTPases of the Rho family could be involved in regulating the mitotic spindle and thereby control ACD in SOPs.
The experimental framework developed for this thesis to study the regulation of mitotic spindle orientation and positioning will open new avenues for controlling this process, which could be useful for slowing the progression of cancer cells. The preliminary results of this project suggest a way in which small GTPases of the Rho family may be involved in controlling asymmetric cell division in vivo in SOPs. The theoretical models described in this study could serve to improve quantitative cell-biology methods for studying ACD.
Abstract:
Consumers are becoming more concerned about food quality, especially regarding how, when and where foods are produced (Haglund et al., 1999; Kahl et al., 2004; Alföldi et al., 2006). Therefore, during recent years there has been growing interest in methods for food quality assessment, especially in picture-developing methods as a complement to traditional chemical analysis of single compounds (Kahl et al., 2006). Biocrystallization, one such picture-developing method, is based on the crystallographic phenomenon that when aqueous solutions of CuCl2 dihydrate are crystallized with the addition of organic solutions, originating, e.g., from crop samples, biocrystallograms with reproducible crystal patterns are generated (Kleber & Steinike-Hartung, 1959). Its output is a crystal pattern on glass plates from which different variables (numbers) can be calculated using image analysis. However, a standardized evaluation method to quantify the morphological features of the biocrystallogram image is lacking. Therefore, the main aims of this research are (1) to optimize an existing statistical model in order to describe all the effects that contribute to the experiment; (2) to investigate the effect of image parameters on the texture analysis of the biocrystallogram images, i.e., region of interest (ROI), color transformation and histogram matching, on samples from the project 020E170/F financed by the Federal Ministry of Food, Agriculture and Consumer Protection (BMELV) (the samples are wheat and carrots from controlled field and farm trials); and (3) to relate the strongest texture-parameter effect to the visual evaluation criteria developed by a group of researchers (University of Kassel, Germany; Louis Bolk Institute (LBI), Netherlands; and Biodynamic Research Association Denmark (BRAD), Denmark) in order to clarify how texture parameters relate to the visual characteristics of an image.
The refined statistical model was implemented as a linear mixed-effects (lme) model with repeated measurements via crossed effects, programmed in R (version 2.1.0). The validity of the F and P values was checked against the SAS program. While the ANOVA yields the same F values, the P values are larger in R because of its more conservative approach. The refined model yields more significant P values. The optimization of the image analysis addressed the following parameters: ROI (region of interest, the area around the geometric center), color transformation (calculation of a one-dimensional gray-level value from the three-dimensional color information of the scanned picture, which is necessary for the texture analysis), and histogram matching (normalization of the histogram of the picture to enhance the contrast and to minimize errors from lighting conditions). The samples were wheat from the DOC trial with 4 field replicates for the years 2003 and 2005, "market samples" (organic and conventional neighbors with the same variety) for 2004 and 2005, carrots obtained from the University of Kassel (2 varieties, 2 nitrogen treatments) for the years 2004, 2005 and 2006, and "market samples" of carrots for the years 2004 and 2005. The criterion for the optimization was repeatability of the differentiation of the samples over the different harvests (years). Different ROIs were found for different samples, reflecting the different pictures. The color transformation that differentiates most efficiently relies on the gray scale, i.e., an equal-weight color transformation. A second dimension of the color transformation appeared only in some years as an effect of color wavelength (hue) for carrots treated with different nitrate fertilizer levels. The best histogram matching is the Gaussian distribution. The approach was to find a connection between the variables from textural image analysis and the different visual criteria.
The relation between the texture parameters and visual evaluation criteria was limited to the carrot samples, especially as these could be well differentiated by the texture analysis. It was possible to connect groups of variables from the texture analysis with groups of criteria from the visual evaluation. These selected variables were able to differentiate the samples but not to classify them according to treatment. In contrast, the visual criteria, which describe the picture as a whole, allowed classification in 80% of the sample cases. This clearly shows the limits of the single-variable approach of the image analysis (texture analysis).
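The Gaussian histogram matching described in this abstract can be sketched as a rank-based mapping of gray-level values onto a Gaussian target distribution. The target mean and standard deviation below are assumed values for illustration, not the project's own settings.

```python
from statistics import NormalDist

def match_to_gaussian(values, mu=128.0, sigma=40.0):
    """Map gray-level values onto a Gaussian target distribution by
    replacing each value with the Gaussian quantile of its rank.
    mu and sigma are assumed target parameters."""
    n = len(values)
    # indices sorted by gray-level value
    order = sorted(range(n), key=lambda i: values[i])
    target = NormalDist(mu, sigma)
    matched = [0.0] * n
    for rank, i in enumerate(order):
        # midpoint rank avoids quantiles of exactly 0 or 1
        p = (rank + 0.5) / n
        matched[i] = target.inv_cdf(p)
    return matched
```

The mapping preserves the ordering of pixel intensities while forcing the overall histogram towards the Gaussian shape, which is the property that makes the matched images comparable across lighting conditions.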
Abstract:
Recent interest in the validation of general circulation models (GCMs) has been devoted to objective methods. A small number of authors have used the direct synoptic identification of phenomena together with a statistical analysis to perform the objective comparison between various datasets. This paper describes a general method for performing the synoptic identification of phenomena that can be used for an objective analysis of atmospheric, or oceanographic, datasets obtained from numerical models and remote sensing. Methods usually associated with image processing have been used to segment the scene and to identify suitable feature points to represent the phenomena of interest. This is performed for each time level. A technique from dynamic scene analysis is then used to link the feature points to form trajectories. The method is fully automatic and should be applicable to a wide range of geophysical fields. An example will be shown of results obtained from this method using data obtained from a run of the Universities Global Atmospheric Modelling Project GCM.
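The trajectory-linking step described above (connecting feature points across time levels) can be illustrated with a simple greedy nearest-neighbour sketch. The gating distance and the greedy matching order are assumptions for illustration, not the paper's actual dynamic-scene-analysis algorithm.

```python
import math

def link_trajectories(frames, max_dist=5.0):
    """Link feature points across successive time levels into
    trajectories by greedy nearest-neighbour matching.
    frames: list of time levels, each a list of (x, y) points.
    max_dist is an assumed gating distance."""
    trajectories = [[p] for p in frames[0]]
    active = list(range(len(trajectories)))  # trajectories still being extended
    for points in frames[1:]:
        unmatched = list(points)
        next_active = []
        for t in active:
            if not unmatched:
                continue
            last = trajectories[t][-1]
            # nearest candidate point to this trajectory's last position
            best = min(unmatched, key=lambda p: math.dist(p, last))
            if math.dist(best, last) <= max_dist:
                trajectories[t].append(best)
                unmatched.remove(best)
                next_active.append(t)
        # leftover points start new trajectories (new phenomena appearing)
        for p in unmatched:
            next_active.append(len(trajectories))
            trajectories.append([p])
        active = next_active
    return trajectories
```

Real dynamic-scene-analysis methods typically add motion prediction and globally optimal assignment; this sketch only shows the linking idea.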
Abstract:
Planning a project with proper consideration of all necessary factors, and managing it to ensure successful implementation, faces many challenges. The initial planning stage when bidding for a project is costly and time-consuming, and usually yields poor accuracy in cost and effort predictions. On the other hand, detailed information on previous projects may be buried in piles of archived documents, making it increasingly difficult to learn from previous experience. Project portfolios have been brought into this field with the aim of improving information sharing and management among different projects. However, the amount of information that can be shared is still limited to generic information. In this paper, we report a recently developed software system, COBRA (Automated Project Information Sharing and Management System), which automatically generates a project plan with effort estimation of time and cost based on data collected from previously completed projects. To maximise data sharing and management among different projects, we propose a method using product-based planning from the PRINCE2 methodology. Keywords: project management, product based planning, best practice, PRINCE2
Abstract:
The basic premise of transaction-cost theory is that the decision to outsource, rather than to undertake work in-house, is determined by the relative costs incurred in each of these forms of economic organization. In construction the "make or buy" decision invariably leads to a contract. Reducing the costs of entering into a contractual relationship (transaction costs) raises the value of production and is therefore desirable. Commonly applied methods of contractor selection may not minimise the costs of contracting. Research evidence suggests that although competitive tendering typically results in the lowest bidder winning the contract, this may not represent the lowest project cost after completion. Multi-parameter and quantitative models for contractor selection have been developed to identify the best (or least risky) among bidders. A major area in which research is still needed is in investigating the impact of different methods of contractor selection on the costs of entering into a contract and the decision to outsource.
Abstract:
With the rapid development in technology over recent years, construction, in common with many areas of industry, has become increasingly complex. It would, therefore, seem to be important to develop and extend the understanding of complexity so that industry in general and in this case the construction industry can work with greater accuracy and efficiency to provide clients with a better service. This paper aims to generate a definition of complexity and a method for its measurement in order to assess its influence upon the accuracy of the quantity surveying profession in UK new build office construction. Quantitative data came from an analysis of twenty projects of varying size and value and qualitative data came from interviews with professional quantity surveyors. The findings highlight the difficulty in defining and measuring project complexity. The correlation between accuracy and complexity was not straightforward, being subjected to many extraneous variables, particularly the impact of project size. Further research is required to develop a better measure of complexity. This is in order to improve the response of quantity surveyors, so that an appropriate level of effort can be applied to individual projects, permitting greater accuracy and enabling better resource planning within the profession.
Abstract:
A perennial issue for land use policy is the evaluation of landscape biodiversity and the associated cost effectiveness of any biodiversity conservation policy actions. Based on the CUA methodology as applied to species conservation, this paper develops a methodology for evaluating the impact on habitats of alternative landscape management scenarios. The method incorporates three dimensions of habitats, quantity change, quality change and relative scarcity, and is illustrated in relation to the alternative landscape management scenarios for the Scottish Highlands (Cairngorms) study area of the BioScene project. The results demonstrate the value of the method for evaluating biodiversity conservation policies through their impact on habitats.
Abstract:
The tagged microarray marker (TAM) method allows high-throughput differentiation between predicted alternative PCR products. Typically, the method is used as a molecular marker approach to determining the allelic states of single nucleotide polymorphisms (SNPs) or insertion-deletion (indel) alleles at genomic loci in multiple individuals. Biotin-labeled PCR products are spotted, unpurified, onto a streptavidin-coated glass slide and the alternative products are differentiated by hybridization to fluorescent detector oligonucleotides that recognize corresponding allele-specific tags on the PCR primers. The main attractions of this method are its high throughput (thousands of PCRs are analyzed per slide), flexibility of scoring (any combination, from a single marker in thousands of samples to thousands of markers in a single sample, can be analyzed) and flexibility of scale (any experimental scale, from a small lab setting up to a large project). This protocol describes an experiment involving 3,072 PCRs scored on a slide. The whole process from the start of PCR setup to receiving the data spreadsheet takes 2 d.
Abstract:
Uncertainty plays a major part in the accuracy of a decision-making process, while its inconsistency is difficult to resolve with existing decision-making tools. Entropy has proved useful for evaluating the inconsistency of uncertainty among different respondents. This study demonstrates an entropy-based financial decision support system called e-FDSS. This integrated system provides decision support for evaluating the attributes (funding options and multiple risks) available in projects. Fuzzy logic theory is included in the system to deal with the qualitative aspects of these options and risks. An adaptive genetic algorithm (AGA) is also employed to solve the decision algorithm in the system, providing optimal and consistent rates for these attributes. Seven simplified, parallel projects from a Hong Kong construction small and medium enterprise (SME) were assessed to evaluate the system. The results show that the system calculates risk-adjusted discount rates (RADR) of projects in an objective way; these rates discount project cash flows impartially. Inconsistency of uncertainty is also successfully evaluated by the entropy method. Finally, the system identifies the favourable funding options managed by a scheme called the SME Loan Guarantee Scheme (SGS). Based on these results, resource allocation can be optimized and the best time to start a new project identified throughout the overall project life cycle.
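The core idea of using entropy to evaluate inconsistency among respondents can be illustrated with a minimal Shannon-entropy sketch over ordinal ratings; this is an illustrative take on the concept, not the paper's exact formulation.

```python
import math
from collections import Counter

def rating_entropy(ratings, base=2):
    """Shannon entropy of a set of respondents' ratings: 0 when all
    respondents agree, maximal when opinions are spread evenly.
    Higher entropy signals more inconsistent uncertainty assessments."""
    counts = Counter(ratings)
    n = len(ratings)
    return -sum((c / n) * math.log(c / n, base) for c in counts.values())
```

A system like the one described could flag attributes whose entropy exceeds a threshold for re-elicitation before computing discount rates.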
IQ in children with autism spectrum disorders: data from the Special Needs and Autism Project (SNAP)
Abstract:
Background Autism spectrum disorder (ASD) was once considered to be highly associated with intellectual disability and to show a characteristic IQ profile, with strengths in performance over verbal abilities and a distinctive pattern of ‘peaks’ and ‘troughs’ at the subtest level. However, there are few data from epidemiological studies. Method Comprehensive clinical assessments were conducted with 156 children aged 10–14 years [mean (s.d.)=11.7 (0.9)], seen as part of an epidemiological study (81 childhood autism, 75 other ASD). A sample weighting procedure enabled us to estimate characteristics of the total ASD population. Results Of the 75 children with ASD, 55% had an intellectual disability (IQ<70) but only 16% had moderate to severe intellectual disability (IQ<50); 28% had average intelligence (115>IQ>85) but only 3% were of above average intelligence (IQ>115). There was some evidence for a clinically significant Performance/Verbal IQ (PIQ/VIQ) discrepancy but discrepant verbal versus performance skills were not associated with a particular pattern of symptoms, as has been reported previously. There was mixed evidence of a characteristic subtest profile: whereas some previously reported patterns were supported (e.g. poor Comprehension), others were not (e.g. no ‘peak’ in Block Design). Adaptive skills were significantly lower than IQ and were associated with severity of early social impairment and also IQ. Conclusions In this epidemiological sample, ASD was less strongly associated with intellectual disability than traditionally held and there was only limited evidence of a distinctive IQ profile. Adaptive outcome was significantly impaired even for those children of average intelligence.
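The sample-weighting step used above to estimate characteristics of the total ASD population can be illustrated with a generic weighted-proportion sketch; the weights and indicator values below are hypothetical, not SNAP's actual weighting scheme.

```python
def weighted_proportion(flags, weights):
    """Estimate a population proportion from a weighted sample: each
    child's binary indicator (e.g. IQ < 70) counts in proportion to
    its sampling weight, correcting for unequal selection probability."""
    total = sum(weights)
    return sum(f * w for f, w in zip(flags, weights)) / total
```

With equal weights this reduces to the raw sample proportion; unequal weights shift the estimate towards the children who represent larger slices of the population.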
Abstract:
The EU-funded 'HealthyHay' project established a sainfoin (Onobrychis viciifolia) germplasm bank at NIAB, Cambridge, with 306 accessions from around the world. A screening method was developed to characterise tannins by thiolytic degradation [1] directly in green plants for the first time. The method was validated by separate analysis of unextractable, extractable and purified tannins using thiolysis, HPLC-GPC and MALDI-TOF MS. Most tannins (58 to 73% of the total) could be recovered after Toyopearl HW50 fractionation with water, aqueous methanol and acetone. The greatest losses during purification occurred amongst larger molecular weight tannins with a mean degree of polymerisation (mDP) > 18. The composition of water-, aqueous methanol- and acetone-soluble tannins differed considerably in their mDP and trans/cis ratios, but not in their prodelphinidin/procyanidin (PD/PC) ratios.
Abstract:
Context: During development, managers, analysts and designers often need to know whether enough requirements analysis work has been done and whether or not it is safe to proceed to the design stage. Objective: This paper describes a new, simple and practical method for assessing our confidence in a set of requirements. Method: We identified four confidence factors and used a goal-oriented framework with a simple ordinal scale to develop a method for assessing confidence. We illustrate the method and show how it has been applied to a real systems development project. Results: We show how assessing confidence in the requirements could have revealed problems in this project earlier and so saved both time and money. Conclusion: Our meta-level assessment of requirements provides a practical and pragmatic method that can prove useful to managers, analysts and designers who need to know when sufficient requirements analysis has been performed.
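One plausible shape for an ordinal-scale confidence assessment like the one described is a weakest-link aggregation over per-factor scores. The factor names, scale labels and aggregation rule below are all assumptions for illustration, not the paper's own method.

```python
# Assumed ordinal confidence scale (the paper's own labels and its
# four factor names are not reproduced here).
SCALE = {"low": 0, "medium": 1, "high": 2}

def requirements_confidence(factor_scores):
    """Aggregate per-factor ordinal scores into an overall confidence
    level using a weakest-link rule: overall confidence is capped by
    the least confident factor, so one weak area cannot be masked
    by strong scores elsewhere."""
    return min(factor_scores.values(), key=lambda s: SCALE[s])
```

A weakest-link rule is deliberately conservative: it tells a manager it is unsafe to proceed to design while any single factor remains weak.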
Abstract:
Landscape restoration has the potential to mitigate habitat loss and fragmentation. However, restoration can take decades to reach the ecological conditions of the target habitats. The National Trust’s Stonehenge Landscape Restoration Project provides an opportunity to evaluate the ecological benefits against the economic and temporal costs. A field survey between June and September 2010 using Lepidoptera as bio-indicators showed that restored grasslands can approach the ecological conditions of the target chalk grassland habitat within 10 years. However, specialist species like Lysandra bellargus (Adonis blue) were absent from restored grasslands and may require additional management to assist their colonisation. Analysis of the Lepidoptera communities showed that both small-scale habitat heterogeneity and age of the habitat were important for explaining Lepidoptera occurrence. These results demonstrate that habitat restoration at the landscape scale combined with appropriate site-scale management can be a relatively rapid and effective method to restore ecological networks and buffer against future climate change.
Abstract:
Healthcare information systems have the potential to enhance productivity, lower costs, and reduce medication errors by automating business processes. However, various issues such as system complexity, system capabilities in relation to user requirements, and rapid changes in business needs affect the use of these systems. In many cases, failure of a system to meet business process needs has pushed users to develop alternative work processes (workarounds) to fill this gap. Some research has been undertaken on why users are motivated to perform and create workarounds, but very little research has assessed the consequences for patient safety. Moreover, the impact of performing these workarounds on the organisation, and how to quantify their risks and benefits, is not well analysed. Generally, there is a lack of rigorous understanding and of qualitative and quantitative studies on healthcare IS workarounds and their outcomes. This project applies A Normative Approach for Modelling Workarounds to develop A Model of Motivation, Constraints, and Consequences. It aims to understand the phenomenon in depth and provide guidelines to organisations on how to deal with workarounds. Finally, the method is demonstrated on a case study example and its relative merits discussed.