23 results for Lines of credit

in Aston University Research Archive


Relevance: 90.00%

Abstract:

1. The role of individual residues in the 8-18 helix of CGRP8-37 in promoting high-affinity binding to CGRP1 receptors expressed on rat L6 and human SK-N-MC cells has been examined. 2. The relative potencies of various derivatives were estimated from their ability to inhibit the human αCGRP-mediated increase in cyclic AMP production and the binding of [125I]-human αCGRP. 3. Arg11 and Arg18 were replaced by serines to give [Ser11,18]CGRP8-37. This bound with a pKi value of <6 to SK-N-MC cells and had apparent pA2 values of 5.81 ± 0.04 and 5.31 ± 0.11 on SK-N-MC and L6 cells respectively. CGRP8-37 had a pKi of 8.22 on SK-N-MC cells and pKb values on the above cell lines of 8.95 ± 0.04 and 8.76 ± 0.04. The arginines were also replaced with glutamic acid residues: [Glu11]CGRP8-37 had a pKb of 7.14 ± 0.14 on SK-N-MC cells (pKi = 7.05 ± 0.05) and 6.99 ± 0.08 on L6 cells; [Glu18]CGRP8-37 had a pKb of 7.10 ± 0.08 on SK-N-MC cells (pKi = 6.91 ± 0.23) and 7.12 ± 0.09 on L6 cells. 4. Leu12, Leu15 and Leu16 were replaced by benzoyl-phenylalanine (Bpa) residues. On SK-N-MC cells, the apparent pA2 values of [Bpa12]-, [Bpa15]- and [Bpa16]CGRP8-37 were 7.43 ± 0.23, 8.34 ± 0.11 and 5.66 ± 0.16 respectively (pKi values of 7.14 ± 0.17, 7.66 ± 0.21 and <6); on L6 cells they were 7.96 ± 0.36, 8.28 ± 0.21 and 6.09 ± 0.04 (all n = 3). 5. It is concluded that Arg11 and Arg18 are involved in specific electrostatic interactions with other residues, either on the CGRP1 receptors or elsewhere on CGRP8-37. Leu16 is in a conformationally restricted site when CGRP8-37 binds to CGRP1 receptors, unlike Leu12 and Leu15.
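The pKi values quoted above come from competition against the binding of [125I]-human αCGRP. As a minimal sketch of how such a value is typically derived, the snippet below applies the Cheng-Prusoff correction to convert an IC50 from a competition curve into a Ki (and hence a pKi); the radioligand concentration, Kd and IC50 used here are illustrative assumptions, not data from this study.

```python
import math

def pki_from_ic50(ic50_m: float, radioligand_conc_m: float, kd_m: float) -> float:
    """Cheng-Prusoff correction: Ki = IC50 / (1 + [L]/Kd), returned as pKi = -log10(Ki)."""
    ki = ic50_m / (1.0 + radioligand_conc_m / kd_m)
    return -math.log10(ki)

# Illustrative numbers only (not taken from the study):
# 50 pM [125I]-CGRP radioligand, Kd = 0.1 nM, antagonist IC50 = 10 nM.
print(round(pki_from_ic50(ic50_m=10e-9, radioligand_conc_m=50e-12, kd_m=0.1e-9), 2))
```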

Relevance: 90.00%

Abstract:

Collateral - generally defined as an asset used to provide security for a lender's loan - is an important feature of credit contracts, and all the available evidence suggests that its use is becoming more pervasive. This informative book builds upon recent research into the topic. Sena analyses three case studies that revolve around the impact that financial constraints have on economic outcomes. In the first case study, the relationship between firms' technical efficiency and increasing financial pressure is explored. In the second case study, the author shows that, under specific circumstances, increasing financial pressure and increasing product market competition can jointly have a positive impact on firms' technical efficiency, although this does not hold for all types of firms. In the third case study, she analyses the impact that financing constraints have on women's start-ups. Unique and revealing, this is the first book to deal so extensively with the topic of collateral and, as such, is a valuable reference for postgraduates and professionals in the fields of macroeconomics, monetary and business economics. © 2008 Vania Sena. All rights reserved.

Relevance: 90.00%

Abstract:

This study integrates research on minority dissent and individual creativity, as well as team diversity and the quality of group decision making, with research on team participation in decision making. From these lines of research, it was proposed that minority dissent would predict innovation in teams but only when teams have high levels of participation in decision making. This hypothesis was tested in 2 studies, 1 involving a homogeneous sample of self-managed teams and 1 involving a heterogeneous sample of cross-functional teams. Study 1 suggested that a newly developed scale to measure minority dissent has discriminant validity. Both Study 1 and Study 2 showed more innovations under high rather than low levels of minority dissent but only when there was a high degree of participation in team decision making. It is concluded that minority dissent stimulates creativity and divergent thought, which, through participation, manifest as innovation.

Relevance: 90.00%

Abstract:

It is known that distillation tray efficiency depends on the liquid flow pattern, particularly for large diameter trays. Scale-up failures due to liquid channelling have occurred, and it is known that fitting flow control devices to trays sometimes improves tray efficiency. Several theoretical models which explain these observations have been published. Further progress in understanding is at present blocked by a lack of experimental measurements of the pattern of liquid concentration over the tray. Flow pattern effects are expected to be significant only on commercial-size trays of large diameter, and the lack of data is a result of the costs, risks and difficulty of making these measurements on full-scale production columns. This work presents a new experiment which simulates distillation by water cooling and provides a means of testing commercial-size trays in the laboratory. Hot water is fed on to the tray and cooled by air forced through the perforations. The analogy between heat and mass transfer shows that the water temperature at any point is analogous to liquid concentration and the enthalpy of the air is analogous to vapour concentration. The effect of the liquid flow pattern on mass transfer is revealed by the temperature field on the tray. The experiment was implemented and evaluated in a column of 1.2 m diameter. The water temperatures were measured by thermocouples interfaced to an electronic computerised data-logging system. The "best surface" through the experimental temperature measurements was obtained by the mathematical technique of B-splines and presented in terms of lines of constant temperature. The results revealed that, in general, liquid channelling is more important in the bubbly "mixed" regime than in the spray regime. However, it was observed that severe channelling also occurred for intense spray at incipient flood conditions, which is an unexpected result. A computer program was written to calculate point efficiency as well as tray efficiency, and the results were compared with distillation efficiencies for similar loadings. The theoretical model of Porter and Lockett for predicting distillation was modified to predict water cooling, and the theoretical predictions were shown to be similar to the experimental temperature profiles. A comparison of the repeatability of the experiments with an error analysis revealed that accurate tray efficiency measurements require temperature measurements to better than ± 0.1 °C, which is achievable with conventional techniques. This was not achieved in this work, and resulted in considerable scatter in the efficiency results. Nevertheless, it is concluded that the new experiment is a valuable tool for investigating the effect of the liquid flow pattern on tray mass transfer.
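The "best surface" through the scattered thermocouple readings was obtained with B-splines and plotted as lines of constant temperature. Below is a minimal sketch of that kind of surface fit using SciPy's smoothing bivariate spline on synthetic tray temperatures; the coordinates, temperature field and smoothing factor are illustrative assumptions, not data from the thesis.

```python
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

rng = np.random.default_rng(0)

# Synthetic scattered thermocouple positions over a 1.2 m diameter tray (metres).
n = 200
x = rng.uniform(-0.6, 0.6, n)
y = rng.uniform(-0.6, 0.6, n)
# Illustrative temperature field: hot water cooling as it flows in the +x direction, plus noise.
temp = 60.0 - 8.0 * (x + 0.6) + rng.normal(0.0, 0.2, n)

# Cubic smoothing B-spline surface through the scattered readings.
surface = SmoothBivariateSpline(x, y, temp, kx=3, ky=3, s=n)

# Evaluate on a regular grid; contours of this grid give the isotherms.
xg = np.linspace(-0.6, 0.6, 25)
yg = np.linspace(-0.6, 0.6, 25)
temp_grid = surface(xg, yg)          # shape (25, 25)
print(temp_grid.shape, float(temp_grid.min()), float(temp_grid.max()))
```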

Relevance: 90.00%

Abstract:

The roots of the concept of cortical columns stretch far back into the history of neuroscience. The impulse to compartmentalise the cortex into functional units can be seen at work in the phrenology of the beginning of the nineteenth century. At the beginning of the next century Korbinian Brodmann and several others published treatises on cortical architectonics. Later, in the middle of that century, Lorente de Nó wrote of chains of ‘reverberatory’ neurons orthogonal to the pial surface of the cortex and called them ‘elementary units of cortical activity’. This is the first hint that a columnar organisation might exist. With the advent of microelectrode recording, first Vernon Mountcastle (1957) and then David Hubel and Torsten Wiesel provided evidence consistent with the idea that columns might constitute units of physiological activity. This idea was backed up in the 1970s by clever histochemical techniques and culminated in Hubel and Wiesel’s well-known ‘ice-cube’ model of the cortex and Szentágothai’s brilliant iconography. The cortical column can thus be seen as the terminus ad quem of several great lines of neuroscientific research: currents originating in phrenology and passing through cytoarchitectonics; currents originating in neurocytology and passing through Lorente de Nó. Famously, Huxley noted the tragedy of a beautiful hypothesis destroyed by an ugly fact. Famously, too, human visual perception is orientated towards seeing edges and demarcations where, perhaps, there are none. Recently the concept of cortical columns has come in for the same radical criticism that undermined the architectonics of the early part of the twentieth century. Does history repeat itself? This paper reviews this history and asks that question.

Relevance: 90.00%

Abstract:

Most parametric software cost estimation models used today evolved in the late 1970s and early 1980s. At that time, the dominant software development techniques in use were the early 'structured methods'. Since then, several new systems development paradigms and methods have emerged, one being Jackson System Development (JSD). Because current cost estimating methods do not take account of these developments, they cannot provide adequate estimates of effort, and hence cost, for such projects. In order to address these shortcomings, two new estimation methods have been developed for JSD projects. One of these methods, JSD-FPA, is a top-down estimating method based on the existing MkII function point method. The other method, JSD-COCOMO, is a sizing technique which sizes a project, in terms of lines of code, from the process structure diagrams and thus provides an input to the traditional COCOMO method. The JSD-FPA method allows JSD projects in both the real-time and scientific application areas to be costed, as well as the commercial information systems applications to which FPA is usually applied. The method is based upon a three-dimensional view of a system specification, as opposed to the largely data-oriented view traditionally used by FPA. The method uses counts of various attributes of a JSD specification to develop a metric which provides an indication of the size of the system to be developed. This size metric is then transformed into an estimate of effort by calculating past project productivity and using this figure to predict the effort, and hence cost, of a future project. The effort estimates produced were validated by comparing them against the effort figures for six actual projects. The JSD-COCOMO method uses counts of the levels in a process structure chart as the input to an empirically derived model which transforms them into an estimate of delivered source code instructions.
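JSD-COCOMO feeds a size estimate in lines of code into the traditional COCOMO model. As a rough illustration of that final step only, the sketch below applies the basic COCOMO effort equation (effort = a·KLOC^b, in person-months) with the published organic-mode coefficients; the size figure is an illustrative assumption, whereas the real JSD-COCOMO count would come from the process structure diagrams.

```python
def basic_cocomo_effort(kloc: float, a: float = 2.4, b: float = 1.05) -> float:
    """Basic COCOMO effort in person-months; defaults are the organic-mode coefficients."""
    return a * kloc ** b

# Illustrative only: a project sized (e.g. via a JSD-COCOMO count) at 20 KLOC.
print(round(basic_cocomo_effort(20.0), 1), "person-months")
```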

Relevance: 90.00%

Abstract:

Studies have shown that the brand “owner” is very influential in positioning the brand, and that when the brand “owner” ceases his or her active role the brand will be perceived differently by consumers. Heider’s Balance Theory (HBT), a cognitive psychological theory, studies the triadic relationship between two persons and an entity and predicts that when a person’s original perception of the relationship is disturbed, the person restructures it to form a new balanced perception. Consequently, this research was undertaken to: conceptualize the brand owner’s impact on consumers’ brand perception; test the applicability of both the static and dynamic predictions of Heider’s Balance Theory to the brand owner-consumer-brand relation (OCB); construct and test a model of the brand owner-consumer-brand relation; and examine whether personality has an influence on OCB. A discovery-oriented approach was taken to understand the selected market segment, the ready-to-wear and diffusion lines of international designer labels. A Chinese Brand Personality Scale, a fashion-proneness scale and hedonic and utilitarian shopping scales were developed and validated. Fifty-one customers were surveyed. Both the traditional and extended methods used with Balance Theory were employed in this study. Responses to a liked brand were used to test and develop the model, while those for a disliked brand were used for testing and confirmation. A “what if” experimental approach was employed to test the applicability of the dynamic HBT predictions in the OCB model. The hypothesized OCB model has been tested and validated. Consumers have been found to hold separate views of the brand and the brand owner, and their responses to contrasting ethical and non-ethical news about the brand owner differ. Personality has been found to have an influence, and two personality-adapted models have been tested and validated. The actual results go beyond the predictions of Balance Theory: a dominant triple-positive balance mode, a dominant negative balance mode and a mode of extreme antipathy have been found, and not all balanced modes are good for the brand. Contrary to Heider’s findings, liking alone may not necessarily lead to a unit relation in the OCB model.
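Heider's Balance Theory predicts that a triad such as owner-consumer-brand is balanced when the product of the signs of its three relations is positive. A minimal sketch of that static prediction is below; the relation values are illustrative, not survey data from this study.

```python
def is_balanced(consumer_owner: int, consumer_brand: int, owner_brand: int) -> bool:
    """Heider triad rule: balanced iff the product of the three relation signs is positive."""
    return consumer_owner * consumer_brand * owner_brand > 0

# Illustrative triads (+1 = positive sentiment / unit relation, -1 = negative):
print(is_balanced(+1, +1, +1))   # likes owner, likes brand, owner tied to brand -> balanced
print(is_balanced(-1, +1, +1))   # dislikes owner after negative news, still likes brand -> imbalanced
```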

Relevance: 90.00%

Abstract:

The development of new products in today's marketing environment is generally accepted as a requirement for the continual growth and prosperity of organisations. The literature is consequently rich with information on the development of various aspects of goods products. In the case of service industries, it can be argued that new service product development is of at least equal importance to service organisations as it is to organisations that produce tangible goods products. Unlike the new goods product literature, the literature on service marketing practices, and in particular new service product development, is relatively sparse. The main purpose of this thesis is to examine a number of aspects of new service product development practice with respect to financial services and, specifically, credit card financial services. The empirical investigation utilises both a case study and a survey approach to examine aspects of new service product development industry practice relating specifically to gaps and deficiencies in the literature with respect to the financial service industry. The findings of the empirical work are subsequently examined in the context in which they provide guidance and support for a new normative new service product development model. The study examines the UK credit card financial service product sector as an industry case study and perspective. The findings of the field work reveal that the new service product development process is still evolving, and that in the case of credit card financial services it can be seen as a well-structured and well-documented process. New product development can also be seen as an incremental, complex, interactive and continuous process which has been applied in a variety of ways. A number of inferences are subsequently presented.

Relevance: 90.00%

Abstract:

Currently available treatments for insulin-dependent diabetes mellitus are often inadequate in terms of both efficacy and patient compliance. Gene therapy offers the possibility of a novel and improved method by which exogenous insulin can be delivered to a patient. This was approached in the present study by constructing a novel insulin-secreting cell line. For the purposes of this work immortalized cell lines were used. Fibroblasts and pituitary cells were transfected with the human preproinsulin gene to create stable lines of proinsulin- and insulin-secreting cells. The effect of known β-cell secretagogues on these cells was investigated, and found mostly to have no stimulatory effect, although IBMX, arginine and ZnSO4 each increased the rate of secretion. Cyclosporin (CyA) is currently the immunosuppressant of choice for transplant recipients; the effect of this treatment on endogenous β-cell function was assessed both in vivo and in vitro. Therapeutic doses of CyA were found to reduce plasma insulin concentrations and to impair glucose tolerance. The effect of immunoisolation on insulin release by HIT T15 cells was also investigated; the presence of an alginate membrane was found to severely impair insulin release. For the first implantation of the insulin-secreting cells, the animal model selected was the athymic nude mouse. This animal is immunoincompetent, and hence the use of an immunosuppressive regimen is circumvented. Graft function was assessed by measurement of plasma human C-peptide concentrations, using a highly specific assay. Intraperitoneal implantation of genetically manipulated insulin-secreting pituitary cells into nude mice subsequently treated with a large dose of streptozotocin (STZ) resulted in a significantly delayed onset of hyperglycaemia compared with control animals. Consumption of a ZnSO4 solution was shown to increase human C-peptide release by the implant. Ensuing studies in nude mice examined the efficacy of different implantation sites and included histochemical examination of the tumours. Aldehyde fuchsin staining and immunocytochemical processing demonstrated the presence of insulin-containing cells within the excised tissue. Following the initial investigations in nude mice, implantation studies were performed in CyA-immunosuppressed normal and STZ-diabetic mice. Graft function was found to be less efficacious, possibly owing to the subcutaneous implantation site or to the immunosuppressive regimen. Histochemical and transmission electron microscopic analysis of the tumour-like cell clusters found at autopsy revealed necrosis of cells at the core, but essentially normal cell morphology, with dense secretory granules in peripheral cells. The thesis provides evidence that gene therapy offers a feasible new approach to insulin delivery.

Relevance: 90.00%

Abstract:

An investigation is carried out into the design of a small local computer network for eventual implementation on the University of Aston campus. Microprocessors are investigated as a possible choice for use as a node controller, for reasons of cost and reliability. Since the network will be local, high-speed lines of megabit order are proposed. After an introduction to several well-known networks, various aspects of networks are discussed, including packet switching, the functions of a node and host-node protocol. Chapter three develops the network philosophy with an introduction to microprocessors. Various organisations of microprocessors into multicomputer and multiprocessor systems are discussed, together with methods of achieving reliable computing. Chapter four presents the simulation model and its implementation as a computer program. The major modelling effort is to study the behaviour of messages queueing for access to the network and the message delay experienced on the network. Use is made of spectral analysis to determine the sampling frequency, while Exponentially Weighted Moving Averages are used for data smoothing.
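The simulation smooths its queueing and delay measurements with Exponentially Weighted Moving Averages. A minimal sketch of that smoothing step is below; the smoothing constant and the sample delays are illustrative assumptions, not outputs of the thesis model.

```python
def ewma(samples, alpha=0.2):
    """Exponentially Weighted Moving Average: s_t = alpha*x_t + (1 - alpha)*s_{t-1}."""
    smoothed = []
    s = samples[0]
    for x in samples:
        s = alpha * x + (1.0 - alpha) * s
        smoothed.append(s)
    return smoothed

# Illustrative message-delay samples (milliseconds) from one simulation run.
delays = [4.1, 3.8, 9.5, 4.0, 4.3, 12.2, 4.4, 4.1]
print([round(v, 2) for v in ewma(delays)])
```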

Relevance: 90.00%

Abstract:

This thesis proposes that, despite many experimental studies of thinking and the development of models of thinking such as Bruner's (1966) enactive, iconic and symbolic developmental modes, the imagery and inner verbal strategies used by children need further investigation to establish a coherent theoretical basis from which to create experimental curricula for the direct improvement of those strategies. Five hundred and twenty-three first, second and third year comprehensive school children were tested on 'recall' imagery, using a modified Betts Imagery Test, and on dual-coding processes (Paivio, 1971, p.179), using the P/W Visual/Verbal Questionnaire, which measures 'applied imagery' and inner verbalising. Three lines of investigation were pursued: 1. an investigation of (a) hypothetical representational strategy differences between boys and girls, and (b) the extent to which strategies change with increasing age; 2. the second and third year children's use of representational processes, taken separately and compared with performance measures of perception, field independence, creativity, self-sufficiency and self-concept; 3. the second and third year children categorised into four dual-coding strategy groups: (a) High Visual/High Verbal, (b) Low Visual/High Verbal, (c) High Visual/Low Verbal and (d) Low Visual/Low Verbal, with these groups compared on the same performance measures. The main result indicates that a hierarchy of dual-coding strategy use can be identified that is significantly related (.01, Binomial Test) to success or failure on the performance measures: the High Visual/High Verbal group registering the highest scores, the Low Visual/High Verbal and High Visual/Low Verbal groups registering intermediate scores, and the Low Visual/Low Verbal group registering the lowest scores. Subsidiary results indicate that boys' use of visual strategies declines, and of verbal strategies increases, with age, while girls' recall imagery strategy increases with age. Educational implications of the main result are discussed, the establishment of experimental curricula proposed, and further research suggested.
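The link between dual-coding strategy group and performance was assessed with a binomial test at the .01 level. Below is a minimal sketch of such a test using SciPy; the counts of measures on which the predicted group ordering held are illustrative assumptions, not the study's data.

```python
from scipy.stats import binomtest

# Illustrative: suppose the predicted group ordering held on 14 of 15 performance measures.
result = binomtest(k=14, n=15, p=0.5, alternative="greater")
print(round(result.pvalue, 5), result.pvalue < 0.01)
```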

Relevance: 90.00%

Abstract:

The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to establish exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry and Kafura's data flow fan-in/fan-out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
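The study redefines counts such as Halstead's operators/operands and McCabe's cyclomatic complexity for Prolog. As a minimal reminder of how those two 'classic' metrics are computed once the raw counts are available, the sketch below evaluates them for illustrative counts; the Prolog-specific counting rules themselves are what the thesis defines, and the numbers here are not taken from its programs.

```python
import math

def halstead_volume(n1: int, n2: int, N1: int, N2: int) -> float:
    """Halstead volume V = (N1 + N2) * log2(n1 + n2), from distinct/total operator and operand counts."""
    return (N1 + N2) * math.log2(n1 + n2)

def cyclomatic_complexity(decision_points: int) -> int:
    """McCabe's metric for a single-entry, single-exit module: V(G) = decision points + 1."""
    return decision_points + 1

# Illustrative counts for one small module (not measured from the thesis programs).
print(round(halstead_volume(n1=12, n2=20, N1=70, N2=55), 1))
print(cyclomatic_complexity(decision_points=6))
```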

Relevance: 90.00%

Abstract:

The reliability of printed circuit board assemblies under dynamic environments, such as those found on board airplanes, ships and land vehicles, is receiving more attention. This research analyses the dynamic characteristics of a printed circuit board (PCB) supported by edge retainers and plug-in connectors. By modelling the wedge retainer and connector as providing simply supported boundary conditions with appropriate rotational spring stiffnesses along their respective edges, with the aid of finite element codes, natural frequencies for the board that agree closely with experimental natural frequencies are obtained. For a PCB supported by two opposite wedge retainers and a plug-in connector, with its remaining edge free of any restraint, it is found that these real supports behave somewhere between the simply supported and clamped boundary conditions and provide a percentage fixity 39.5% greater than the classical simply supported case. By using an eigensensitivity method, the rotational stiffnesses representing the boundary supports of the PCB can be updated effectively and are capable of representing the dynamics of the PCB accurately. The result shows that the percentage error in the fundamental frequency of the PCB finite element model is substantially reduced from 22.3% to 1.3%. The procedure demonstrates the effectiveness of using only the vibration test frequencies as reference data when the mode shapes of the original untuned model are almost identical to the referenced modes/experimental data. When only modal frequencies are used in model improvement, the analysis is very much simplified; furthermore, the time taken to obtain the experimental data is substantially reduced, as the experimental mode shapes are not required. In addition, this thesis advocates a relatively simple method of determining the support locations that maximise the fundamental frequency of vibrating structures. The technique is simple and does not require any optimisation or sequential search algorithm in the analysis. The key to the procedure is to position the necessary supports so as to eliminate the lower modes of the original configuration. This is accomplished by introducing point supports along the nodal lines of the highest possible mode of the original configuration, so that all the other lower modes are eliminated by the introduction of the new or extra supports to the structure. It is also proposed to inspect the average driving point residues along the nodal lines of vibrating plates to find the optimal locations of the supports. Numerical examples are provided to demonstrate the validity of the approach. Applying it to the PCB supported on three sides by two wedge retainers and a connector, it is found that the single point constraint yielding the maximum fundamental frequency is located at the mid-point of the nodal line, namely node 39. This point support has the effect of increasing the structure's fundamental frequency from 68.4 Hz to 146.9 Hz, or 115% higher.
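The supports are represented by spring stiffnesses, and the updated stiffnesses are inferred from measured natural frequencies. As a much-simplified illustration of why stiffer or additional supports raise the natural frequencies, the sketch below solves the generalized eigenvalue problem K x = ω² M x for a two-degree-of-freedom model with and without an added support spring; the masses and stiffnesses are illustrative assumptions, not the PCB finite element model.

```python
import numpy as np
from scipy.linalg import eigh

def natural_frequencies_hz(K: np.ndarray, M: np.ndarray) -> np.ndarray:
    """Solve K x = w^2 M x and return the natural frequencies in Hz."""
    eigvals = eigh(K, M, eigvals_only=True)
    return np.sqrt(eigvals) / (2.0 * np.pi)

# Illustrative 2-DOF model: two 0.05 kg lumped masses coupled by 1e4 N/m springs.
M = np.diag([0.05, 0.05])
K_free = np.array([[2.0e4, -1.0e4],
                   [-1.0e4,  1.0e4]])

# Adding a support spring (5e3 N/m) at the second DOF stiffens the boundary.
K_supported = K_free + np.diag([0.0, 5.0e3])

print(natural_frequencies_hz(K_free, M).round(1))       # lower frequencies
print(natural_frequencies_hz(K_supported, M).round(1))  # raised by the extra support
```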

Relevance: 90.00%

Abstract:

Cystic fibrosis (CF) is the most common lethal inherited disease among Caucasians and arises due to mutations in a chloride channel called the cystic fibrosis transmembrane conductance regulator. A hallmark of this disease is chronic bacterial infection of the airways, which is usually associated with pathogens such as Pseudomonas aeruginosa, S. aureus and, increasingly prominently, B. cepacia. The excessive inflammatory response, which leads to irreversible lung damage, results in the long term in the death of the patient at around the age of 40 years. Understanding the pathogenesis of CF currently relies on animal models, such as those employing genetically modified mice, and on single-cell culture models, which are grown either as polarised or non-polarised epithelium in vitro. Whilst these approaches partially enable the study of disease progression in CF, both types of model have inherent limitations. The overall aim of this thesis was to establish a multicellular co-culture model of normal and CF human airways in vitro, which helps to partially overcome these limitations and permits analysis of cell-to-cell communication in the airways. These models could then be used to examine the co-ordinated response of the airways to infection with relevant pathogens, in order to validate this approach over animal and single-cell models. Epithelial cell lines of non-CF and CF background were therefore employed in a co-culture model together with human pulmonary fibroblasts. Co-cultures were grown on collagen-coated permeable supports at an air-liquid interface to promote epithelial cell differentiation. The models were characterised, and essential features for investigating CF infections and inflammatory responses were investigated and analysed. A pseudostratified-like epithelial cell layer was established at the air-liquid interface (ALI) of mono- and co-cultures, and cell layer integrity was verified by tight junction (TJ) staining and transepithelial resistance (TER) measurements. Mono- and co-cultures were also found to secrete the airway mucin MUC5AC. The influence of bacterial infections was found to be most challenging when intact S. aureus, B. cepacia and P. aeruginosa were used. CF mono- and co-cultures were found to mimic the hyperinflammatory state found in CF, which was confirmed by analysing IL-8 secretion in these models. These co-culture models will help to elucidate the role fibroblasts play in the inflammatory response to bacteria and will provide a useful testing platform to further investigate the dysregulated airway responses seen in CF.

Relevance: 90.00%

Abstract:

While much has been discussed about the relationship between ownership and the financial performance of banks in emerging markets, the literature on cross-ownership differences in the credit market behaviour of banks in emerging economies is sparse. Using a portfolio choice model and bank-level data from India for nine years (1995–96 to 2003–04), we examine banks' behaviour in the context of the credit markets of an emerging market economy. Our results indicate that, in India, the data for domestic banks fit the aforementioned portfolio-choice model well, especially for private banks, but the model cannot explain the behaviour of foreign banks. In general, the allocation of assets between risk-free government securities and risky credit is affected by past allocation patterns, stock exchange listing (for private banks), the risk aversion of banks, regulations regarding the treatment of non-performing assets (NPAs), and the ability of banks to recover doubtful credit. It is also evident that banks deal with changing levels of systematic risk by altering the ratio of securitised to non-securitised credit.
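The underlying portfolio-choice framework treats a bank as allocating its assets between risk-free government securities and risky credit. Below is a minimal sketch of one standard mean-variance version of that trade-off, in which the optimal risky share is (E[r] − r_f) / (γ·σ²); the return, risk and risk-aversion figures are illustrative assumptions, not estimates from the Indian bank data used in the paper.

```python
def optimal_risky_share(expected_return: float, risk_free_rate: float,
                        variance: float, risk_aversion: float) -> float:
    """Mean-variance optimum: share of the portfolio placed in risky credit."""
    share = (expected_return - risk_free_rate) / (risk_aversion * variance)
    return min(max(share, 0.0), 1.0)   # no short selling or leverage

# Illustrative: credit yields 11% with 12% std. dev., government securities yield 6%, gamma = 4.
print(round(optimal_risky_share(0.11, 0.06, 0.12 ** 2, 4.0), 2))
```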