74 results for information studies


Relevance: 30.00%

Abstract:

This is a two-part study, beginning with a nationwide survey of the private sector. The hypotheses derived from the Western literature were not significantly supported when the results were analyzed. The existing literature on the phenomenon under investigation appears to be oriented mainly towards Anglo-Saxon cultures, which differ from the Malaysian culture in which the study was conducted. Access barriers to private sector organizations, however, shifted the focus of the research to the second part of the study, which examined the issues in detail in four public sector organizations currently implementing accounting information systems (AIS) - two hospitals and two universities. In this second part, the researcher developed formal and substantive propositions from the qualitative interviews, substantiated them through a cross-case analysis and, as a result, proposes a model for accountants' participation in AIS implementation. The research shows that the process of influencing accountants to participate in AIS implementation is more complex than the literature suggests. Many issues surfaced during the case studies, such as conflict and empowerment, which set a foundation for further research into how participation can be secured to help make AIS implementation, as part of an organizational agenda, a success.

Relevance: 30.00%

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance: 30.00%

Abstract:

The study of surfactant monolayers is certainly not a new technique, but the application of monolayer studies to elucidate controlling factors in liposome design remains an underutilised resource. Using a Langmuir-Blodgett trough, pure and mixed lipid monolayers can be investigated, both for the interactions within the monolayer and for interfacial interactions with drugs in the aqueous sub-phase. Although these monolayers are effectively only half a bilayer, with a flat rather than curved structure, information from these studies can be translated effectively into liposomal systems. Here we outline the background, general protocols and application of Langmuir studies, with a focus on their use in liposomal systems. A range of case studies is discussed showing how the technique can support the development of liposomal drug delivery. Examples include investigations into the effect of cholesterol within the liposome bilayer, understanding effective lipid packing within the bilayer to promote the retention of both water-soluble and poorly soluble drugs, the effect of alkyl chain length on lipid packing, and drug-monolayer electrostatic interactions that promote bilayer repacking.
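One routine mixed-monolayer analysis of the kind alluded to above - assessing condensing interactions such as cholesterol's - compares the measured mean molecular area of a mixed film with the ideal additive value at the same surface pressure. A minimal sketch; the molecular areas below are illustrative numbers, not data from the abstracted work:

```python
def ideal_mixed_area(x1, area1, area2):
    """Ideal (additive-mixing) mean molecular area of a binary monolayer
    at a fixed surface pressure: A12 = x1*A1 + (1 - x1)*A2."""
    return x1 * area1 + (1.0 - x1) * area2

def excess_area(measured_area, x1, area1, area2):
    """Excess area of mixing: measured minus ideal. Negative values
    indicate a condensing interaction between the film components."""
    return measured_area - ideal_mixed_area(x1, area1, area2)

# Illustrative numbers: phospholipid 0.52 nm^2, cholesterol 0.38 nm^2,
# a 70:30 mixture measured at 0.44 nm^2 per molecule
a_ex = excess_area(0.44, 0.7, 0.52, 0.38)   # negative => condensed film
```

A deviation of the measured isotherm below the additive line is the usual monolayer signature of the cholesterol condensing effect discussed in such studies.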

Relevance: 30.00%

Abstract:

An ultra-high-vacuum system capable of attaining pressures of 10^-12 mm Hg was used for thermal desorption experiments. The metal chosen was tantalum, both because of its suitability for thermal desorption experiments and because relatively little work has been done on this metal. The gases investigated were carbon monoxide, hydrogen and ethylene. The kinetic and thermodynamic parameters of the desorption reaction were calculated and the values obtained related to the reaction on the surface. Thermal desorption alone could not supply all the information necessary to form a complete picture of the desorption reaction; further information was obtained by using a quadrupole mass spectrometer to analyse the desorbed species. The identification of the desorbed species, combined with the values of the desorption parameters, meant that possible adatom structures could be postulated. The combination of these two techniques proved to be a very powerful tool for investigating gas-metal surface reactions and gave realistic values for the measured parameters, such as the surface coverage, order of reaction, activation energy and pre-exponential factor for desorption. Electron microscopy and X-ray diffraction were also used to investigate the effect of the gases on the metal surface.

Relevance: 30.00%

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance: 30.00%

Abstract:

This project is concerned with the deterioration of surface coatings as a result of weathering and exposure to a pollutant gas (in this case nitric oxide). Poly(vinyl chloride) (PVC) plastisol surface coatings have been exposed to natural and artificial weathering, and a comparison of the effects of the two types of weathering has been made using various analytical techniques. Each technique has been assessed for its value in providing information on the changes taking place in the coatings during ageing; the techniques include goniophotometry, micro-penetrometry, surface energy measurements, weight loss measurements, thermal analysis and scanning electron microscopy. The results of these studies have been combined to show the changes undergone by PVC plastisol surface coatings during ageing and the effects which additives have on their behaviour, in particular plasticiser, pigment, and UV and thermal stabilisers. Finally, a preliminary study of the interaction between nitric oxide and five commercial polymers - polypropylene, cellulose acetate butyrate, polystyrene, polyethylene terephthalate and polycarbonate - has been carried out. Each sample was examined using infra-red spectroscopy in the transmission mode.

Relevance: 30.00%

Abstract:

The investigations described in this thesis concern the molecular interactions between polar solute molecules and various aromatic compounds in solution. Three different physical methods were employed. Nuclear magnetic resonance (n.m.r.) spectroscopy was used to determine the nature and strength of the interactions and the geometry of the transient complexes formed. Cryoscopic studies provided information on the stoichiometry of the complexes. Dielectric constant studies were conducted in an attempt to confirm and supplement the spectroscopic investigations. The systems studied were those between nitromethane, chloroform and acetonitrile (solutes) and various methyl-substituted benzenes. In the n.m.r. work, the dependence of the solute chemical shift upon the composition of the solutions was determined. From this, the equilibrium quotient (K) for the formation of each complex and the shift induced in the solute proton by the aromatic in the complex were evaluated. The thermodynamic parameters for the interactions were obtained by determining K at several temperatures. The stoichiometries of the complexes obtained from cryoscopic studies were found to agree with those deduced from the spectroscopic investigations. For most systems it is suggested that only one type of complex, of 1:1 stoichiometry, predominates, except for the acetonitrile-benzene system, where a 1:2 complex is formed. Two sets of dielectric studies were conducted, the first to show that the nature of the interaction is dipole-induced dipole and the second to calculate K. The equilibrium quotients obtained from spectroscopic and dielectric studies are compared. Time-averaged geometries of the complexes are proposed. For the 1:1 complexes the solute appears to lie symmetrically about the aromatic six-fold axis, whereas for the 1:2 complex a sandwich structure is proposed.
It is suggested that the complexes are formed through a dipole-induced dipole interaction and that steric factors play some part in complex formation.
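Extracting K from the concentration dependence of the solute shift is commonly done with a Benesi-Hildebrand-type linearisation: for a 1:1 complex with the aromatic in excess, 1/Δobs = 1/(K·Δc·[A]) + 1/Δc, so K and Δc follow from the slope and intercept. A sketch on synthetic, noise-free data; the values of K and Δc are illustrative, not results from the thesis:

```python
import numpy as np

def fit_k_benesi_hildebrand(conc, delta_obs):
    """Linearised 1:1 binding fit: 1/Δobs = 1/(K·Δc·[A]) + 1/Δc.
    Slope = 1/(K·Δc), intercept = 1/Δc, hence K = intercept/slope."""
    slope, intercept = np.polyfit(1.0 / conc, 1.0 / delta_obs, 1)
    return intercept / slope, 1.0 / intercept

# Synthetic check: K = 0.5 l/mol, Δc = 1.5 ppm (illustrative values)
conc = np.array([0.2, 0.5, 1.0, 2.0, 4.0])          # aromatic concentration
delta_obs = 1.5 * 0.5 * conc / (1.0 + 0.5 * conc)   # induced solute shift
k, delta_c = fit_k_benesi_hildebrand(conc, delta_obs)
```

Repeating the fit at several temperatures and plotting ln K against 1/T then yields the thermodynamic parameters (ΔH, ΔS) referred to in the abstract.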

Relevance: 30.00%

Abstract:

This research concerns information systems and information systems development. The thesis describes an approach to information systems development called Multiview. This is a methodology which seeks to combine the strengths of a number of different, existing approaches in a coherent manner. Many of these approaches are radically different in terms of concepts, philosophy, assumptions, methods, techniques and tools. Three case studies are described presenting Multiview 'in action'. The first is used mainly to expose the strengths and weaknesses of an early version of the approach discussed in the thesis. Tools and techniques are described in the thesis which aim to strengthen the approach. Two further case studies are presented to illustrate the use of this second version of Multiview. This is not put forward as an 'ideal methodology' and the case studies expose some of the difficulties and practical problems of information systems work and the use of the methodology. A more contingency based approach to information systems development is advocated using Multiview as a framework rather than a prescriptive tool. Each information systems project and the use of the framework is unique, contingent on the particular problem situation. The skills of different analysts, the backgrounds of users and the situations in which they are constrained to work have always to be taken into account in any project. The realities of the situation will cause departure from the 'ideal methodology' in order to allow for the exigencies of the real world. Multiview can therefore be said to be an approach used to explore the application area in order to develop an information system.

Relevance: 30.00%

Abstract:

In previous sea-surface variability studies, researchers have failed to utilise the full ERS-1 mission because of the varying orbital characteristics of each mission phase, and most have simply ignored the Ice and Geodetic phases. This project aims to introduce a technique which allows the straightforward use of all orbital phases, regardless of orbit type. The technique is based upon single-satellite crossovers. Unfortunately, the ERS-1 orbital height is still poorly resolved (owing to higher air drag and stronger gravitational effects) when compared with that of TOPEX/Poseidon (T/P), so, to make best use of the ERS-1 crossover data, corrections to the ERS-1 orbital heights are calculated by fitting a cubic spline to dual-crossover residuals with T/P. This correction is validated by comparing dual-satellite crossovers with tide gauge data. The crossover processing technique is validated by comparing the extracted sea-surface variability information with that from T/P repeat-pass data. The two data sets are then combined into a single consistent data set for analysis of sea-surface variability patterns. These patterns are simplified by the use of an empirical orthogonal function decomposition, which breaks the signals into spatial modes that are then discussed separately. Further studies carried out on these data include an analysis of the characteristics of the annual signal, a discussion of the evidence for Rossby wave propagation on a global basis, and finally an analysis of the evidence for global mean sea level rise.
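Computationally, an empirical orthogonal function decomposition of the kind mentioned above is a singular value decomposition of the time-by-space anomaly matrix: the right singular vectors are the spatial modes, and the squared singular values give each mode's share of the variance. A minimal sketch on a synthetic field:

```python
import numpy as np

def eof_decompose(field):
    """EOF analysis of a (time x space) field via SVD of the anomaly matrix.
    Returns spatial modes (rows of vt), temporal amplitudes, and the
    fraction of variance explained by each mode."""
    anomalies = field - field.mean(axis=0)        # remove the time-mean surface
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    return vt, u * s, s**2 / np.sum(s**2)

# Illustrative field: one dominant annual-cycle-like mode over 100 grid points
t = np.linspace(0.0, 2.0 * np.pi, 50)
pattern = np.sin(np.linspace(0.0, np.pi, 100))
field = np.outer(np.sin(t), pattern)
modes, amplitudes, var_frac = eof_decompose(field)
```

For real altimetric data the leading modes would then be inspected individually, as the abstract describes, with the annual signal typically dominating the first mode.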

Relevance: 30.00%

Abstract:

This thesis begins by providing a review of techniques for interpreting the thermal response at the earth's surface acquired using remote sensing technology. Historic limitations in the precision with which imagery acquired from airborne platforms can be geometrically corrected and co-registered have meant that relatively little work has been carried out examining the diurnal variation of surface temperature over wide regions. Although emerging remote sensing systems provide the potential to register temporal image data within satisfactory levels of accuracy, this technology is still not widely available and does not address the issue of historic data sets which cannot be rectified using conventional parametric approaches. In overcoming these problems, the second part of this thesis describes the development of an alternative approach to rectifying airborne line-scanned imagery. The underlying assumption that scan lines within the imagery are straight greatly reduces the number of ground control points required to describe the image geometry. Furthermore, the use of pattern-matching procedures to identify geometric disparities between raw line-scanned imagery and corresponding aerial photography enables the correction procedure to be almost fully automated. By reconstructing the raw image data on a truly line-by-line basis, it is possible to register the airborne line-scanned imagery to the aerial photography with an average accuracy of better than one pixel. Provided corresponding aerial photography is available, this approach can be applied in the absence of platform altitude information, allowing multi-temporal data sets to be corrected and registered.
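The straight-scan-line assumption reduces each line's registration to a low-dimensional problem; in the simplest one-dimensional form, the disparity of a scan line against the corresponding line of reference photography can be estimated by cross-correlation. This is a deliberate simplification of the pattern-matching procedure described (integer pixel shifts only, one line at a time):

```python
import numpy as np

def line_shift(scan_line, reference_line):
    """Estimate the integer along-line shift between a raw scan line and the
    corresponding reference line by maximising the cross-correlation."""
    a = scan_line - scan_line.mean()
    b = reference_line - reference_line.mean()
    corr = np.correlate(a, b, mode="full")
    # index 0 of 'full' output corresponds to a lag of -(len(b) - 1)
    return int(np.argmax(corr)) - (len(b) - 1)

# Illustrative check: a bright feature displaced by 3 pixels along the line
x = np.arange(64, dtype=float)
reference = np.exp(-((x - 20.0) ** 2) / 9.0)
scan = np.roll(reference, 3)   # same feature, shifted 3 pixels
```

Repeating such an estimate line by line, against control points or matched features, is what allows the imagery to be rebuilt on the line-by-line basis the abstract describes.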

Relevance: 30.00%

Abstract:

This collection of papers records a series of studies, carried out over a period of some 50 years, on two aspects of river pollution control - the prevention of pollution by the biological filtration of sewage, and the monitoring of river pollution by biological surveillance. The earlier studies were carried out to develop methods of controlling the flies which bred in the filters and caused serious nuisance, and a possible public health hazard, when they dispersed to surrounding villages. Although the application of insecticides proved effective as an alleviative measure, it resulted in only a temporary disturbance of the ecological balance and was therefore considered ecologically unsound as a long-term solution. Subsequent investigations showed that the fly populations in filters were largely determined by the amount of food available, in the form of filter film, to the grazing larval stage. It was also established that the winter deterioration in filter performance was due to the excessive accumulation of film, so further investigations were carried out to determine the factors responsible for the accumulation of film in different types of filter. Methods of filtration which were thought to control film accumulation by increasing the flushing action of the sewage were found to control fungal film by creating nutrient-limiting conditions. In some filters, increasing the hydraulic flushing reduced the grazing fauna population in the surface layers and resulted in an increase in film. The results of these investigations were successfully applied in modifying filters and in the design of a Double Filtration process. These studies on biological filters led to the conclusion that they should be designed and operated as ecological systems and not merely as hydraulic ones. Studies on the effects of sewage effluents on Birmingham streams confirmed the findings of earlier workers, justifying their claim for the use of biological methods in detecting and assessing river pollution.
Further ecological studies showed the sensitivity of benthic riffle communities to organic pollution. Using experimental channels and laboratory studies, the different environmental conditions associated with organic pollution were investigated. The degree and duration of oxygen depletion during the dark hours were found to be a critical factor, and the relative tolerance of different taxa to other pollutants, such as ammonia, differed. Although colonisation samplers proved valuable for sampling difficult sites, the invertebrate data they generated were not suitable for processing into any of the commonly used biotic indexes. Several of the papers, written by request for presentation at conferences, presented the biological viewpoint on river pollution and water quality issues at the time and advocated the use of biological methods. The information and experience gained in these investigations were used as the "domain expert" in the development of artificial intelligence systems for the biological surveillance of river water quality.
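The score-based biotic indexes referred to above (for example the BMWP score and its derived average score per taxon, ASPT) reduce an invertebrate sample to a pollution indicator: sensitive families score high, tolerant ones low. A sketch using an illustrative subset of family scores, not an authoritative table:

```python
# Illustrative subset of BMWP-style family scores (examples only;
# the published tables should be consulted for real assessments).
FAMILY_SCORES = {
    "Heptageniidae": 10,  # pollution-sensitive mayflies
    "Gammaridae": 6,
    "Baetidae": 4,
    "Asellidae": 3,
    "Chironomidae": 2,
    "Oligochaeta": 1,     # pollution-tolerant worms
}

def bmwp_aspt(families_present):
    """Sum the scores of the scoring families present (BMWP) and divide
    by the number of scoring taxa (ASPT, average score per taxon)."""
    scores = [FAMILY_SCORES[f] for f in families_present if f in FAMILY_SCORES]
    bmwp = sum(scores)
    aspt = bmwp / len(scores) if scores else 0.0
    return bmwp, aspt

clean_site = bmwp_aspt(["Heptageniidae", "Gammaridae", "Baetidae"])
polluted_site = bmwp_aspt(["Chironomidae", "Oligochaeta"])
```

ASPT is often preferred over the raw sum because it is less sensitive to sampling effort, which is one reason colonisation-sampler data that under-represent some taxa sit awkwardly with such indexes.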

Relevance: 30.00%

Abstract:

It is known that parallel pathways exist within the visual system. These have been described as magnocellular and parvocellular as a result of the layered organisation of the lateral geniculate nucleus and extend from the retina to the cortex. Dopamine (DA) and acetylcholine (ACH) are neurotransmitters that are present in the visual pathway. DA is present in the retina and is associated with the interplexiform cells and horizontal cells. ACH is also present in the retina and is associated with displaced amacrine cells; it is also present in the superior colliculus. DA is found to be significantly depleted in the brain of Parkinson's disease (PD) patients and ACH in Alzheimer's disease (AD) patients. For this reason these diseases were used to assess the function of DA and ACH in the electrophysiology of the visual pathway. Experiments were conducted on young normals to design stimuli that would preferentially activate the magnocellular or parvocellular pathway. These stimuli were then used to evoke visual evoked potentials (VEP) in patients with PD and AD, in order to assess the function of DA and ACH in the visual pathway. Electroretinograms (ERGs) were also measured in PD patients to assess the role of DA in the retina. In addition, peripheral ACH function was assessed by measuring VEPs, ERGs and contrast sensitivity (CS) in young normals following the topical instillation of hyoscine hydrobromide (an anticholinergic drug). The results indicate that the magnocellular pathway can be divided into two: a cholinergic tectal-association area pathway carrying luminance information, and a non-cholinergic geniculo-cortical pathway carrying spatial information. It was also found that depletion of DA had very little effect on the VEPs or ERGs, confirming a general regulatory function for this neurotransmitter.

Relevance: 30.00%

Abstract:

Citation information: Armstrong RA, Davies LN, Dunne MCM & Gilmartin B. Statistical guidelines for clinical studies of human vision. Ophthalmic Physiol Opt 2011, 31, 123-136. doi: 10.1111/j.1475-1313.2010.00815.x

ABSTRACT: Statistical analysis of data can be complex, and different statisticians may disagree as to the correct approach, leading to conflict between authors, editors and reviewers. The objective of this article is to provide statistical advice for contributors to optometric and ophthalmic journals, to provide advice specifically relevant to clinical studies of human vision, and to recommend statistical analyses that can be used in a variety of circumstances. In submitting an article in which quantitative data are reported, authors should describe clearly the statistical procedures they have used and justify each stage of the analysis. This is especially important if more complex or 'non-standard' analyses have been carried out. The article begins with some general comments on data analysis concerning sample size and 'power', hypothesis testing, parametric and non-parametric variables, 'bootstrap methods', one- and two-tail testing, and the Bonferroni correction. More specific advice is then given with reference to particular statistical procedures that can be used on a variety of types of data. Where relevant, examples of correct statistical practice are given with reference to recently published articles in the optometric and ophthalmic literature.
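The Bonferroni correction mentioned in the abstract is the simplest multiple-comparisons adjustment: with m comparisons and a family-wise error rate α, each individual p-value is tested against the stricter threshold α/m. A minimal sketch:

```python
def bonferroni(p_values, alpha=0.05):
    """Bonferroni correction: with m comparisons, each p-value is tested
    against the adjusted threshold alpha / m."""
    threshold = alpha / len(p_values)
    return [p < threshold for p in p_values], threshold

# Example: four comparisons at a family-wise alpha of 0.05
significant, cutoff = bonferroni([0.001, 0.02, 0.04, 0.20])
# only the first comparison survives the adjusted threshold of 0.0125
```

The correction is deliberately conservative; this conservatism is one of the trade-offs such guidelines typically ask authors to justify.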

Relevance: 30.00%

Abstract:

The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey was carried out of large UK companies which confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate.
A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not the method of developing an estimating model or tool, but the way in which an estimate is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The question of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data flow fan-in/out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
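McCabe's cyclomatic complexity, one of the 'classic' metrics extended here, is V(G) = number of decision points + 1 for a single-entry, single-exit routine. A crude keyword-counting sketch of the idea; a real metrics tool parses the source, and the token list below is illustrative and aimed at Python-like syntax rather than Prolog (where the thesis had to redefine the counts):

```python
import re

# Branch-introducing tokens counted as decision points (illustrative list).
DECISION_TOKENS = r"\b(if|elif|while|for|case|and|or)\b"

def cyclomatic_complexity(source):
    """Approximate McCabe's V(G) = decision points + 1 by counting
    branch keywords in the source text."""
    return len(re.findall(DECISION_TOKENS, source)) + 1

sample = (
    "if x > 0:\n"
    "    while y:\n"
    "        y -= 1\n"
    "for i in items:\n"
    "    total += i\n"
)
vg = cyclomatic_complexity(sample)   # three decisions -> V(G) = 4
```

For Prolog the decision structure lives in clause alternatives and cuts rather than keywords, which is precisely why the metric counts had to be redefined before the comparison reported above was possible.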

Relevance: 30.00%

Abstract:

This is a multiple case study of the leadership language of three senior women working in a large corporation in Bahrain. The study’s main aim is to explore the linguistic practices the women leaders use with their colleagues and subordinates in corporate meetings. Adopting a Foucauldian (1972) notion of ‘discourses’ as social practices and a view of gender as socially constructed and discursively performed (Butler 1990), this research aims to unveil the competing discourses which may shape the leadership language of senior women in their communities of practice. The research is situated within the broader field of Sociolinguistics and the specific field of Language and Gender. To address the research aim, a case study approach incorporating multiple methods of qualitative data collection (observation, interviews, and shadowing) was utilised to gather information about the three women leaders and produce a rich description of their use of language in and out of meeting contexts. For analysis, principles of Qualitative Data Analysis (QDA) were used to organise and sort the large amount of data, and Feminist Post-Structuralist Discourse Analysis (FPDA) was adopted to produce a multi-faceted analysis of the subjects, their leadership language, power relations, and competing discourses in the context. It was found that the three senior women enact leadership differently, making variable use of a repertoire of conventionally masculine and feminine linguistic practices. However, they all appear to have limited language resources and even more limiting subject positions, and they all have to exercise considerable linguistic expertise to police and modify their language in order to avoid the ‘double bind’. Yet the extent of these limitations and constraints depends on the community of practice with its prevailing discourses, which appear to have their roots in Islamic and cultural practices as well as some Western influences acquired throughout the company’s history.
It is concluded that it may be particularly challenging for Middle Eastern women to achieve any degree of equality with men in the workplace, because discourses of gender difference lie at the core of Islamic teaching and ideology.