978 results for Must -- Analysis
Abstract:
Diet analysis and advice for patients with tooth wear is potentially the most logical intervention to arrest attrition, erosion and abrasion. It is saliva that protects the teeth against corrosion by the acids which soften enamel and make it susceptible to wear. Thus the lifestyles and diets of patients at risk need to be analysed for sources of acid and reasons for lost salivary protection. Medical conditions which put patients at risk of tooth wear are principally: asthma, bulimia nervosa, caffeine addiction, diabetes mellitus, exercise dehydration, functional depression, gastroesophageal reflux in alcoholism, hypertension and syndromes with salivary hypofunction. The sources of acid are various, but loss of salivary protection is the common theme. In healthy young Australians, soft drinks are the main source of acid, and exercise dehydration the main reason for loss of salivary protection. In the medically compromised, diet acids and gastroesophageal reflux are the sources, but medications are the main reasons for lost salivary protection. Diet advice for patients with tooth wear must: promote a healthy lifestyle and diet strategy that conserves the teeth by natural means of salivary stimulation; and address the specific needs of the patients' oral and medical conditions. Individualised, patient-empowering erosion WATCH strategies, based on Water, Acid, Taste, Calcium and Health, are urgently required to combat the emerging epidemic of tooth wear currently being experienced in westernised societies.
Abstract:
The tartrate-resistant acid phosphatase (TRAP) is present in multiple tissues, including kidney, liver, lung, spleen, and bone. Recent study of TRAP gene expression has provided evidence for distinct promoters within the TRAP gene, suggesting that the gene has alternative, tissue-preferred mRNA transcripts. Examination of endogenous TRAP exon 1B and 1C mRNA transcripts revealed tissue-preferred transcript abundance, with increased exon 1B transcripts detected in liver and kidney and increased exon 1C transcripts detected in bone and spleen. In this investigation, we have made transgenic mice that express a marker gene driven by two candidate promoters, designated BC and C, within the TRAP gene. The BC and C promoters are 2.2 and 1.6 kb, respectively, measured from the translation initiation site. Evaluation of BC transgenic lines demonstrated robust expression in multiple tissues. In contrast, significant transgene expression was not detected in C transgenic lines. Evaluation of transgene mRNAs in BC transgenic lines revealed that virtually all expression was in the form of B transcripts, suggesting that the tissue-preferred pattern of endogenous TRAP was not replicated in the BC transgenic line. Likewise, osteoclastogenic cultures from BC, but not C, transgenic bone marrow cells expressed the transgene following receptor activator of NF-kappaB ligand/macrophage colony-stimulating factor stimulation. In conclusion, when compared with the 2.2-kb BC portion of the TRAP promoter region, the 1.6-kb C portion does not account for significant gene expression in vivo or in vitro; production of the bone- and spleen-preferred TRAP C transcript must depend on regulatory elements outside of the 2.2-kb promoter.
As the majority of currently investigated transcription factors that influence transcriptional regulation of osteoclast gene expression bind within the 1.6-kb C portion of the TRAP promoter, it is likely that transcription-factor binding sites outside of the 2.2-kb region will have profound effects on regulation of the gene in vivo and in vitro.
Abstract:
Information security devices must preserve security properties even in the presence of faults. This in turn requires a rigorous evaluation of the system behaviours resulting from component failures, especially how such failures affect information flow. We introduce a compositional method of static analysis for fail-secure behaviour. Our method uses reachability matrices to identify potentially undesirable information flows based on the fault modes of the system's components.
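The reachability-matrix analysis can be sketched in a few lines. In this illustrative example (the graph, the fault mode and the function names are invented for illustration, not taken from the paper), components form a directed flow graph, a fault mode contributes extra flow edges, and Warshall's transitive closure exposes any source-to-sink information flows that the fault enables:

```python
from typing import List, Tuple

def transitive_closure(adj: List[List[bool]]) -> List[List[bool]]:
    """Warshall's algorithm: reach[i][j] is True iff information can
    flow from component i to component j along some path."""
    n = len(adj)
    reach = [row[:] for row in adj]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    return reach

def undesired_flows(adj, faults, sources, sinks) -> List[Tuple[int, int]]:
    """Add the extra flow edges introduced by a component fault mode,
    then report (source, sink) pairs that become reachable."""
    faulty = [row[:] for row in adj]
    for (i, j) in faults:
        faulty[i][j] = True
    reach = transitive_closure(faulty)
    return [(s, t) for s in sources for t in sinks if reach[s][t]]

# Components 0..3: 0 holds secret input, 3 drives a public output.
# Normal design: 0 -> 1 and 2 -> 3 only, so no path from 0 to 3.
adj = [[False] * 4 for _ in range(4)]
adj[0][1] = adj[2][3] = True
# Fault mode of component 1: it leaks onto the bus feeding component 2.
print(undesired_flows(adj, faults=[(1, 2)], sources=[0], sinks=[3]))
# → [(0, 3)]
```

Because the closure of a subsystem summarises all of its internal paths, such matrices compose: a larger system's analysis can reuse the closures of its parts.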
Abstract:
Microarrays are used to monitor the expression of thousands of gene transcripts. This technique requires high-quality RNA, which can be extracted from a variety of sources, including autopsy brain tissue. Most nucleic acids and proteins are reasonably stable post mortem. However, their abundance and integrity can exhibit marked intra- and inter-subject variability, so care must be taken when comparisons between case-groups are made. We will review issues associated with the sampling of RNA from autopsy brain tissue in relation to various ante- and post-mortem factors.
Abstract:
To make vision possible, the visual nervous system must represent the most informative features in the light pattern captured by the eye. Here we use Gaussian scale-space theory to derive a multiscale model for edge analysis and we test it in perceptual experiments. At all scales there are two stages of spatial filtering. An odd-symmetric, Gaussian first derivative filter provides the input to a Gaussian second derivative filter. Crucially, the output at each stage is half-wave rectified before feeding forward to the next. This creates nonlinear channels selectively responsive to one edge polarity while suppressing spurious or "phantom" edges. The two stages have properties analogous to simple and complex cells in the visual cortex. Edges are found as peaks in a scale-space response map that is the output of the second stage. The position and scale of the peak response identify the location and blur of the edge. The model predicts remarkably accurately our results on human perception of edge location and blur for a wide range of luminance profiles, including the surprising finding that blurred edges look sharper when their length is made shorter. The model enhances our understanding of early vision by integrating computational, physiological, and psychophysical approaches. © ARVO.
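The two filtering stages can be sketched in one dimension, assuming sampled Gaussian-derivative kernels; the kernel support, the sigmoidal test edge and all parameter values below are illustrative choices, not those fitted in the paper:

```python
import numpy as np

def gauss_deriv(x, sigma, order):
    """Sampled first- or second-derivative-of-Gaussian kernel."""
    g = np.exp(-x**2 / (2 * sigma**2))
    if order == 1:
        return -x / sigma**2 * g
    return (x**2 / sigma**4 - 1 / sigma**2) * g

def edge_response(profile, sigma):
    """Two-stage filter: odd (first-derivative) filter, half-wave
    rectification, then even (second-derivative) filter and a second
    rectification. The peak of the response marks the edge."""
    half = 4 * int(np.ceil(sigma))
    x = np.arange(-half, half + 1)
    stage1 = np.convolve(profile, gauss_deriv(x, sigma, 1), mode="same")
    stage1 = np.maximum(stage1, 0.0)           # keep one edge polarity
    stage2 = np.convolve(stage1, gauss_deriv(x, sigma, 2), mode="same")
    return np.maximum(-stage2, 0.0)            # rectified peak response

# A blurred step edge centred at sample 100.
t = np.arange(200)
profile = 1.0 / (1.0 + np.exp(-(t - 100) / 3.0))
resp = edge_response(profile, sigma=4.0)
print(int(np.argmax(resp)))  # peak ≈ 100, the true edge location
```

Repeating this over a range of sigma values yields the scale-space response map described above, with the peak's scale estimating the edge blur.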
Abstract:
To carry out an analysis of variance, several assumptions are made about the nature of the experimental data which have to be at least approximately true for the tests to be valid. One of the most important of these assumptions is that a measured quantity must be a parametric variable, i.e., a member of a normally distributed population. If the data are not normally distributed, then one method of approach is to transform the data to a different scale so that the new variable is more likely to be normally distributed. An alternative method, however, is to use a non-parametric analysis of variance. There are a limited number of such tests available, but two useful tests are described in this Statnote, viz., the Kruskal-Wallis test and Friedman's analysis of variance.
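Both tests are available in standard statistical software; the sketch below uses SciPy's implementations, with invented data (the group and treatment values are purely illustrative):

```python
from scipy.stats import kruskal, friedmanchisquare

# Kruskal-Wallis: three independent groups (hypothetical measurements).
g1 = [2.9, 3.0, 2.5, 2.6, 3.2]
g2 = [3.8, 2.7, 4.0, 2.4]
g3 = [2.8, 3.4, 3.7, 2.2, 2.0]
H, p_kw = kruskal(g1, g2, g3)

# Friedman's analysis of variance: three treatments applied to the
# same six subjects (repeated measures).
t1 = [7.0, 9.9, 8.5, 5.1, 10.3, 8.6]
t2 = [5.3, 5.7, 4.7, 3.5, 7.7, 6.0]
t3 = [4.9, 7.6, 5.5, 2.8, 8.4, 7.2]
chi2_stat, p_f = friedmanchisquare(t1, t2, t3)

print(f"Kruskal-Wallis H = {H:.2f}, p = {p_kw:.3f}")
print(f"Friedman chi-square = {chi2_stat:.2f}, p = {p_f:.3f}")
```

Neither test assumes normality: Kruskal-Wallis ranks all observations across independent groups, while Friedman's test ranks within each subject and so suits randomised-block designs.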
Abstract:
Multiple regression analysis is a complex statistical method with many potential uses. It has also become one of the most abused of all statistical procedures, since anyone with a database and suitable software can carry it out. An investigator should always have a clear hypothesis in mind before carrying out such a procedure, and a knowledge of the limitations of each aspect of the analysis. In addition, multiple regression is probably best used in an exploratory context, identifying variables that might profitably be examined by more detailed studies. Where there are many variables potentially influencing Y, they are likely to be intercorrelated and to account for relatively small amounts of the variance. Any analysis in which R squared is less than 50% should be suspect as probably not indicating the presence of significant variables. A further problem relates to sample size. It is often stated that the number of subjects or patients must be at least 5-10 times the number of variables included in the study [5]. This advice should be taken only as a rough guide, but it does indicate that the variables included should be selected with great care, as inclusion of an obviously unimportant variable may have a significant impact on the sample size required.
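The two warning signs above (an R squared below 50%, and fewer than 5-10 cases per variable) can be screened for mechanically. A minimal sketch in Python/NumPy, with simulated data and illustrative threshold values:

```python
import numpy as np

def fit_and_screen(X, y, min_ratio=5, min_r2=0.5):
    """Ordinary least squares with two screening rules: flag a low
    R-squared and flag too few cases per predictor variable."""
    n, p = X.shape
    Xd = np.column_stack([np.ones(n), X])        # add intercept column
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    notes = []
    if n < min_ratio * p:
        notes.append(f"only {n} cases for {p} predictors")
    if r2 < min_r2:
        notes.append(f"R-squared = {r2:.2f} below {min_r2}")
    return beta, r2, notes

# Simulated data: only the first of three predictors matters.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = 2.0 * X[:, 0] + rng.normal(scale=0.5, size=50)
beta, r2, notes = fit_and_screen(X, y)
print(round(r2, 2), notes)
```

Such a screen only flags the symptoms the text describes; it does not replace the prior hypothesis or the careful selection of variables.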
Abstract:
The organic matter in five oil shales (three from the Kimmeridge Clay sequence, one from the Oxford Clay sequence and one from the Julia Creek deposits in Australia) has been isolated by acid demineralisation, separated into kerogens and bitumens by solvent extraction and then characterised in some detail by chromatographic, spectroscopic and degradative techniques. Kerogens cannot be characterised as easily as bitumens because of their insolubility, and hence before any detailed molecular information can be obtained from them they must be degraded into lower molecular weight, more soluble components. Unfortunately, the determination of kerogen structures has all too often involved degradations that were far too harsh and which led to destruction of much of the structural information. For this reason a number of milder, more selective degradative procedures have been tested and used to probe the structure of kerogens. These are:
1. Lithium aluminium hydride reduction. This procedure is commonly used to remove pyrite from kerogens and it may also increase their solubility by reduction of labile functional groups. Although reduction of the kerogens was confirmed, increases in solubility were correlated with pyrite content and not kerogen reduction.
2. O-methylation in the presence of a phase transfer catalyst. By the removal of hydrogen-bond interactions via O-methylation, it was possible to determine the contribution of such secondary interactions to the insolubility of the kerogens. Problems were encountered with the use of the phase transfer catalyst.
3. Stepwise alkaline potassium permanganate oxidation. Significant kerogen dissolution was achieved using this procedure, but uncontrolled oxidation of initial oxidation products proved to be a problem. A comparison with the peroxytrifluoroacetic acid oxidation of these kerogens was made.
4. Peroxytrifluoroacetic acid oxidation. This was used because it preferentially degrades aromatic rings whilst leaving any benzylic positions intact. Considerable conversion of the kerogens into soluble products was achieved with this procedure.
At all stages of degradation the products were fully characterised where possible using a variety of techniques including elemental analysis, solution-state 1H and 13C nuclear magnetic resonance, solid-state 13C nuclear magnetic resonance, gel-permeation chromatography, gas chromatography-mass spectrometry, Fourier transform infrared spectroscopy and some ultraviolet-visible spectroscopy.
Abstract:
This research examines a behaviour-based safety (BBS) intervention within a paper mill in the South East of England. Two other mills are examined for comparison: one with an established BBS programme, and one improving its safety management system through management ownership. BBS programmes have become popular within the UK, but most of the research into their efficacy is carried out by the BBS providers themselves. This thesis aims to evaluate a BBS intervention from a standpoint that is not commercially biased in favour of BBS schemes. The aim of a BBS scheme is to change personnel behaviours or attitudes, which in turn will positively affect the organisation's safety culture. The research framework involved a qualitative methodology to examine the effects of the intervention on the paper mill's safety culture. The techniques used were questionnaires and semi-structured interviews, in addition to observation and discussions made possible by the author's position as participant observer. The results demonstrated a failure to improve any aspect of the mill's safety culture, which worsened following the BBS intervention. Issues such as trust, morale, communication and support of management showed significant signs of negative workforce response. The paper mill where the safety management system approach was utilised demonstrated a significantly improved safety culture and achieved site ownership from middle managers and supervisors. Research has demonstrated that a solid foundation is required prior to successfully implementing a BBS programme. For a programme to work there must be middle management support in addition to senior management commitment. If a trade union actively distances itself from BBS, the programme is also unlikely to be effective.
This thesis proposes that BBS observation programmes are not suitable for the papermaking industry, particularly when staffing levels are low due to challenging economic conditions. Observers are not available when there are high hazard situations and this suggests that BBS implementation is not the correct intervention for the paper industry.
Abstract:
An intelligent agent, operating in an external world which cannot be fully described in its internal world model, must be able to monitor the success of a previously generated plan and to respond to any errors which may have occurred. The process of error analysis requires the ability to reason in an expert fashion about time and about processes occurring in the world. Reasoning about time is needed to deal with causality. Reasoning about processes is needed since the direct effects of a plan action can be completely specified when the plan is generated, but the indirect effects cannot. For example, the action 'open tap' leads with certainty to 'tap open', whereas whether there will be a fluid flow and how long it might last is more difficult to predict. The majority of existing planning systems cannot handle these kinds of reasoning, thus limiting their usefulness. This thesis argues that both kinds of reasoning require a complex internal representation of the world. The use of Qualitative Process Theory and an interval-based representation of time is proposed as a representation scheme for such a world model. The planning system which was constructed has been tested on a set of realistic planning scenarios. It is shown that even simple planning problems, such as making a cup of coffee, require extensive reasoning if they are to be carried out successfully. The final chapter concludes that the planning system described does allow the correct solution of planning problems involving complex side effects, which planners up to now have been unable to solve.
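Interval-based representations of time are typically built on qualitative relations between intervals, in the style of Allen's interval algebra. A minimal sketch (the `allen` function and the tap/flow intervals below are hypothetical illustrations, not the thesis's actual implementation):

```python
def allen(a, b):
    """Classify the qualitative relation between two intervals
    a = (a0, a1) and b = (b0, b1); inverse relations are returned
    with an 'inv-' prefix (e.g. 'inv-before' means 'after')."""
    (a0, a1), (b0, b1) = a, b
    if a1 < b0:
        return "before"
    if a1 == b0:
        return "meets"
    if a0 == b0 and a1 == b1:
        return "equal"
    if a0 == b0 and a1 < b1:
        return "starts"
    if a0 > b0 and a1 == b1:
        return "finishes"
    if a0 > b0 and a1 < b1:
        return "during"
    if a0 < b0 < a1 < b1:
        return "overlaps"
    return "inv-" + allen(b, a)   # remaining cases are inverses

# 'tap open' holds over one interval; an inferred fluid flow holds
# over a sub-interval of it.
tap_open = (0, 10)
flow = (2, 8)
print(allen(flow, tap_open))      # → during
print(allen(tap_open, (10, 15)))  # → meets
```

Such qualitative relations let a monitor check, for instance, that a predicted flow interval lies 'during' the interval in which the tap is open, without committing to exact durations.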
Abstract:
The term "pharmacogenetics" has been defined as the scientific study of inherited factors that affect the human drug response. Many pharmacogenetic studies have been published since 1995 and have focussed on the principal enzyme family involved in drug metabolism, the cytochrome P450 family, particularly cytochromes P450 2C9 and 2C19. In order to investigate the pharmacogenetic aspect of pharmacotherapy, the relevant studies describing the association of pharmacogenetic factor(s) with drug responses must be retrieved from the existing literature using a systematic review approach. In addition, the estimation of variant allele prevalence for the gene under study in different ethnic populations is important for pharmacogenetic studies. In this thesis, the prevalence of CYP2C9/2C19 alleles in different ethnicities has been estimated through meta-analysis and population genetic principles. The clinical outcome of CYP2C9/2C19 allelic variation on the pharmacotherapy of epilepsy has been investigated; although many new antiepileptic drugs (AEDs) have been launched onto the market, carbamazepine, phenobarbital and phenytoin are still the major agents in the pharmacotherapy of epilepsy. Therefore, phenytoin was chosen as a model AED and the effect of CYP2C9/2C19 genetic polymorphism on phenytoin metabolism was further examined. An estimation of the allele prevalence was undertaken for three CYP2C9/2C19 alleles respectively, using a meta-analysis of studies that conform to Hardy-Weinberg equilibrium. The prevalence of CYP2C9*1 is approximately 81%, 96%, 97% and 94% in Caucasian, Chinese, Japanese and African populations respectively; the pooled prevalence of CYP2C19*1 is about 86%, 57%, 58% and 85% in these ethnic populations respectively. However, the studies of association between CYP2C9/2C19 polymorphism and phenytoin metabolism failed to achieve any qualitative or quantitative conclusion.
Therefore, mephenytoin was examined as a probe drug for the association between CYP2C19 polymorphism and the mephenytoin metabolic ratio. Similarly, an analysis of the association between CYP2C9 polymorphism and warfarin dose requirement was undertaken. It was confirmed that subjects carrying two mutated CYP2C19 alleles have a higher S/R mephenytoin ratio due to deficient CYP2C19 enzyme activity. The studies of warfarin and CYP2C9 polymorphism did not provide a conclusive result due to poor comparability between studies. The genetic polymorphism of drug-metabolising enzymes has been studied extensively; however, other genetic factors, such as multiple drug resistance (MDR) genes and genes encoding ion channels, which may contribute to variability in the function of drug transporters and targets, require more attention in future pharmacogenetic studies of antiepileptic drugs.
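The Hardy-Weinberg screening of studies mentioned above can be sketched as follows; the function and the genotype counts are hypothetical illustrations (not data from the thesis), using the standard one-degree-of-freedom chi-square goodness-of-fit test:

```python
import math

def hwe_check(n_11, n_12, n_22):
    """Estimate the *1 allele frequency from genotype counts and test
    Hardy-Weinberg equilibrium (1-df chi-square goodness of fit)."""
    n = n_11 + n_12 + n_22
    p = (2 * n_11 + n_12) / (2 * n)              # *1 allele frequency
    expected = [n * p * p, 2 * n * p * (1 - p), n * (1 - p) ** 2]
    observed = [n_11, n_12, n_22]
    stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    # survival function of a 1-df chi-square, via the error function
    p_value = math.erfc(math.sqrt(stat / 2.0))
    return p, p_value

# Hypothetical study of 800 subjects: *1/*1, *1/*2 and *2/*2 counts.
freq, p_value = hwe_check(655, 139, 6)
print(f"*1 frequency = {freq:.3f}, HWE p-value = {p_value:.2f}")
```

Studies whose genotype distributions deviate from the expected p-squared, 2pq, q-squared proportions (a small p-value) would be excluded before pooling allele prevalences across studies.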
Abstract:
A detailed literature survey confirmed cold roll-forming to be a complex and little understood process. In spite of its growing value, the process remains largely un-automated, with few principles used in set-up of the rolling mill. This work concentrates on experimental investigations of operating conditions in order to gain a scientific understanding of the process. The operating conditions are: inter-pass distance, roll load, roll speed and horizontal roll alignment. Fifty tests have been carried out under varied operating conditions, measuring section quality and longitudinal straining to give a picture of bending. A channel section was chosen for its simplicity and compatibility with previous work. Quality was measured in terms of vertical bow, twist and cross-sectional geometric accuracy, and a complete method of classifying quality has been devised. The longitudinal strain profile was recorded by the use of strain gauges attached to the strip surface at five locations. Parameter control is shown to be important in allowing consistency in section quality. At present rolling mills are constructed with large tolerances on operating conditions. By reduction of the variability in parameters, section consistency is maintained and mill down-time is reduced. Roll load, alignment and differential roll speed are all shown to affect quality, and can be used to control quality. Set-up time is reduced by improving the design of the mill so that parameter values can be measured and set without the need for judgment by eye. Values of parameters can be guided by models of the process, although elements of experience are still unavoidable. Despite increased parameter control, section quality is variable, if only due to variability in strip material properties. Parameters must therefore be changed during rolling. Ideally this can take place by closed-loop feedback control. Future work lies in overcoming the problems connected with this control.
Abstract:
How precisely do media influence their readers, listeners and viewers? In this paper, we argue that any serious study of the psychology of media influence must incorporate a systematic analysis of media material. However, psychology presently lacks a methodology for doing this that is sensitive to context, relying instead on generalised methods like content or discourse analysis. In this paper, we develop an argument in support of a technique that we have called Media Framing Analysis (MFA), a formal procedure for conducting analyses of (primarily news) media texts. MFA draws on elements of existing framing research from communication and other social scientific research, while at the same time incorporating features of particular relevance to psychology, such as narrative and characterisation.
Abstract:
In this thesis, details of a proposed method for the elastic-plastic failure load analysis of complete building structures are given. In order to handle the problem, a computer programme in Atlas Autocode is produced. The structures consist of a number of parallel shear walls and intermediate frames connected by floor slabs. The results of an experimental investigation are given to verify the theoretical results and to demonstrate various factors that may influence the behaviour of these structures. Large full-scale practical structures are also analysed by the proposed method, and suggestions are made for achieving design economy as well as for extending research in various aspects of this field. The existing programme for elastic-plastic analysis of large frames is modified to allow for the effect of composite action of structural members, i.e. reinforced concrete floor slabs and the supporting steel beams. This modified programme is used to analyse some framed-type structures with composite action as well as those which incorporate plates and shear walls. The results obtained are studied to ascertain the influence of composite action and other factors on the load-carrying capacity of both bare frames and complete building structures. The theoretical failure load presented in this thesis does not predict the overall failure load of the structure, nor does it predict the partial failure load of the shear walls and slabs; it merely predicts the partial failure load of a single frame and assumes that the loss of stiffness of such a frame renders the overall structure unusable. For most structures the analysis proposed in this thesis is likely to break down prematurely due to the failure of the slab and shear wall system, and this factor must be taken into account in any future work on such structures. The experimental work reported in this thesis is acknowledged to be unsatisfactory as a verification of the limited theory proposed.
In particular, Perspex was not found to be a suitable material for testing at high loads; micro-concrete may be more suitable.
Abstract:
In a general introduction to the road-accident phenomenon inside and outside Iran, the results of previous research and of international conferences and seminars on road safety are reviewed. A sample road between Tehran and Mashad has also been investigated as a case study. In examining road-accident data and information, first, the information presented in road-accident report forms in developed countries is discussed and, second, the procedures for road-accident data collection in Iran are investigated in detail. The data supplied by the Iran Road-Police Central Statistics Office are analysed, different rates are computed, comparisons with other nations are made, and the results are discussed. Such analyses and comparisons are also presented for the different provinces of Iran. It is concluded that each province, with its own natural, geographical, social and economic characteristics, possesses its own reasons for the quality and quantity of road accidents and therefore must receive its own appropriate remedial solutions. The questions of "what is the cost of road accidents", "why and how evaluate the cost" and "what is the appropriate way to approach such evaluation" are all discussed, and then "the cost of road accidents in Iran" is computed on two different bases: "gross national output" and "court award". It is concluded that this cost is about 1.5 per cent of the country's national product. In Appendix 3 an impressive example is given of the trend of costs and benefits that can be attributed to investment in road-safety measures.