784 results for accommodate
Abstract:
This thesis explores the strategic positioning [SP] activities of charitable organizations [COs] within the wider sector of voluntary and non-profit organizations [VNPOs] in the UK. Despite the growing interest in SP for British COs in an increasingly competitive operating environment and changing policy context, there is a lack of research in mainstream marketing/strategic management studies on this topic for charities, whilst the specialist literature on VNPOs has neglected the study of SP. The thesis begins with an extended literature review of the concept of positioning in both commercial [for-profit] and charitable organizations. It concludes that the majority of the theoretical underpinnings of SP that are prescribed for COs have been derived from the commercial strategy/marketing literature. There is currently a lack of theoretical and conceptual models that can accommodate the particular context of COs and guide strategic positioning practice in them. The research contained in this thesis is intended to fill some of these research gaps. It combines an exploratory postal survey and four cross-sectional case studies to describe the SP activities of a sample of general welfare and social care charities and to identify the key factors that influence their choice of positioning strategies [PSs]. It concludes that charitable organizations have begun to undertake SP to differentiate themselves from other charities that provide similar services. Their PSs have both generic features and other characteristics that are unique to them. A combination of external environmental and organizational factors influences their choice of PSs. A theoretical model, which depicts these factors, is developed in this research. It highlights the role of governmental influence, other external environmental forces, the charity's mission, organizational resources, and influential stakeholders in shaping the charity's PS. The study concludes by considering the theoretical and managerial implications of the findings for the study of charitable and non-profit organizations.
Abstract:
Psoriasis is characterised by epidermal proliferation and inflammation resulting in the appearance of elevated erythematous plaques. The ratio of c-AMP/c-GMP is decreased in psoriatic skin and, when the epidermal cell surface receptors are stimulated by β-adrenergic agonists, intracellular ATP is transformed into c-AMP, thus restoring the c-AMP/c-GMP levels. This thesis describes a series of β-adrenoceptor agonists for topical delivery based upon the soft-drug approach. Soft drugs are defined as biologically active, therapeutically useful chemical compounds (drugs) characterised by a predictable and controllable in vivo destruction (metabolism) to non-toxic moieties after they achieve their therapeutic role. The N-substituent can accommodate a broad range of structures and here the alkoxycarbonylethyl group has been used to provide metabolic susceptibility. The increased polarity of the dihydroxy acid, expected after metabolic conversion of the soft-drug, ethyl N-[2'-(3',4'-dihydroxyphenyl)-2'-hydroxyethyl]-3-aminopropionate, should eliminate agonist activity. Further, to prevent oxidation and enhance topical delivery, the catechol hydroxyl groups have been esterified to produce a pro-soft-drug which generates the soft-drug in enzymic systems. The chemical hydrolysis of the pro-soft-drug proceeded via the formation of the dipivaloyloxy acid and it failed to generate the active dihydroxy ester soft-drug. In contrast, in the presence of porcine liver carboxyesterase, the hydrolysis of the pro-soft-drug proceeded via the formation of the required active soft-drug. This compound thus has the appropriate kinetic features to enable it to be evaluated further as a drug for the treatment of psoriasis. The pH-rate profile for the hydrolysis of the soft-drug indicated a maximum stability at pH ∼ 4.0. The individual rate constants for the degradation and the pKa were analysed by nonlinear regression. The pKa of 7.40 is in excellent agreement with that determined by direct titration (7.43) and indicates that satisfactory convergence was achieved. The soft-drug was poorly transported across a silicone membrane; it was also air-sensitive due to oxidation of the catechol group. The transport of the pro-soft-drug was more efficient and, over the donor pH range 3–8, increased with pH. At lower values, the largely protonated species was not transported. However, above pH 7, chemical degradation was rapid, so that a donor pH of 5–6 was optimum. The β-adrenergic agonist activity of these compounds was tested in vitro by measuring chronotropic and inotropic responses in the guinea pig atria and relaxation of guinea pig trachea precontracted with acetylcholine (10⁻³ M). The soft-drug was a full agonist on the tracheal preparation but was less potent than isoprenaline. Responses to the soft-drug were competitively antagonised by propranolol (10⁻⁶ M). The soft-drug produced an increase in force and rate of the isolated atrial preparation. The propyl analogue was equally potent, with an ED₅₀ of 6.52 × 10⁻⁷ M. In contrast, at equivalent doses, the dihydroxy acid showed no activity; only a marginal effect was observed on the tracheal preparation. For the pro-soft-drug, responses were of slow onset in both preparations, with a slowly developing relaxation of the tracheal preparation at high concentrations (10⁻⁵ M). This is consistent with in vitro results where the dipivaloyl groups are hydrolysed more readily than the ethyl ester to give the active soft-drug. These results confirm the validity of the pro-soft-drug approach to the delivery of β-adrenoceptor agonists.
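The nonlinear-regression step mentioned in this abstract (recovering the degradation rate constants and the pKa from a pH-rate profile) can be illustrated with a short sketch. The functional form below (acid-catalysed, uncatalysed and ionised-species terms governed by a single pKa) is only one plausible model for a catechol ester pH-rate profile, and the data are synthetic; neither is taken from the thesis.

```python
import numpy as np
from scipy.optimize import curve_fit

def k_obs(pH, k_H, k_0, k_A, pKa):
    """Observed first-order degradation rate constant as a function of pH.

    Assumed (hypothetical) model: acid-catalysed term + uncatalysed term
    + a term for the ionised species, whose fraction is set by the pKa."""
    aH = 10.0 ** (-pH)                                 # hydrogen-ion activity
    frac_ionised = 1.0 / (1.0 + 10.0 ** (pKa - pH))
    return k_H * aH + k_0 + k_A * frac_ionised

# Synthetic "measurements" generated from assumed parameters plus noise
rng = np.random.default_rng(0)
pH = np.linspace(1.0, 9.0, 17)
true = dict(k_H=5.0, k_0=1e-3, k_A=0.5, pKa=7.4)
y = k_obs(pH, **true) * (1 + 0.05 * rng.standard_normal(pH.size))

# Nonlinear least-squares fit of the four parameters
popt, pcov = curve_fit(k_obs, pH, y, p0=[1.0, 1e-2, 0.1, 7.0])
k_H_fit, k_0_fit, k_A_fit, pKa_fit = popt
print(f"fitted pKa = {pKa_fit:.2f}")                   # should recover ~7.4
print("pH of maximum stability:", pH[np.argmin(k_obs(pH, *popt))])
```

A fitted pKa that agrees with an independent titration value, as reported in the abstract, is the usual check that such a regression has converged on a physically sensible solution.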
Abstract:
Rotating fluidised beds offer the potential for high intensity combustion, large turndown and extended range of fluidising velocity due to the imposition of an artificial gravitational field. Low thermal capacity should also allow rapid response to load changes. This thesis describes investigations of the validity of these potential virtues. Experiments, at atmospheric pressure, were conducted in flow visualisation rigs and a combustor designed to accommodate a distributor 200 mm in diameter and 80 mm in axial length. Ancillary experiments were conducted in a 6" diameter conventional fluidised bed. The investigations encompassed assessment of fluidisation and elutriation, coal feed requirements, start-up and steady-state combustion using premixed propane and air, transition from propane to coal combustion, and mechanical design. Assessments were made of an elutriation model and some effects of particle size on the combustion of premixed fuel gas and air. The findings were: a) More reliable start-up and control methods must be developed. Combustion of premixed propane and air led to severe mechanical and operating problems. Manual control of coal combustion was inadequate. b) Design criteria must encompass pressure loss, mechanical strength and high temperature resistance. The flow characteristics of ancillaries and the distributor must be matched. c) Fluidisation of a range of particle sizes was investigated. New correlations for minimum fluidisation and fully supported velocities are proposed. Some effects on elutriation of particle size and the distance between the bed surface and exhaust port have been identified. A conic distributor did not aid initial bed distribution; furthermore, airflow instability was encountered with this distributor shape. Future use of conic distributors is not recommended. Axial solids mixing was found to be poor. A coal feeder was developed which produced uniform fuel distribution throughout the bed. The report concludes that small scale inhibits development of mechanical design and exploration of performance. Future research requires larger combustors and automatic control.
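As background to the fluidisation correlations mentioned above (the thesis' new correlations are not reproduced here), the defining feature of a rotating bed is that the body force opposing the gas is centripetal rather than gravitational, so the minimum-fluidisation pressure drop follows from a radial force balance of the standard form:

```latex
% Pressure drop across a rotating (centrifugal) fluidised bed at minimum
% fluidisation: the bed "weight" is supplied by the centripetal field
% \omega^2 r instead of g, integrated from the inner bed surface r_i to
% the distributor radius r_o.
\Delta p_{mf} \;=\; \int_{r_i}^{r_o} (1-\varepsilon_{mf})\,(\rho_p - \rho_f)\,\omega^2 r \,\mathrm{d}r
\;=\; \tfrac{1}{2}\,(1-\varepsilon_{mf})\,(\rho_p - \rho_f)\,\omega^2\,(r_o^2 - r_i^2)
```

Here ε_mf is the voidage at minimum fluidisation, ρ_p and ρ_f the particle and fluid densities, and ω the angular speed; this is a standard textbook result given only as context for why conventional correlations must be re-derived with ω²r in place of g.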
Abstract:
Class II Major Histocompatibility Complex (MHC) molecules have an open-ended binding groove which can accommodate peptides of varying lengths. Several studies have demonstrated that peptide flanking residues (PFRs) which lie outside the core binding groove can influence peptide binding and T cell recognition. By using data from the AntiJen database we were able to characterise systematically the influence of PFRs on peptide affinity for MHC class II molecules.
Abstract:
The importance of non-technical factors in the design and implementation of information systems has been increasingly recognised by both researchers and practitioners, and recent literature highlights the need for new tools and techniques with an organisational, rather than technical, focus. The gap between what is technically possible and what is generally practised is particularly wide in the sales and marketing field. This research describes the design and implementation of a decision support system (DSS) for marketing planning and control in a small, but complex, company and examines the nature of the difficulties encountered. An intermediary with functional, rather than technical, expertise is used as a strategy for overcoming these difficulties by taking control of the whole of the systems design and implementation cycle. Given the practical nature of the research, an action research approach is adopted, with the researcher undertaking this role. This approach provides a detailed case study of what actually happens during the DSS development cycle, allowing the influence of organisational factors to be captured. The findings of the research show how the main focus of the intermediary's role needs to be adapted over the systems development cycle: from coordination and liaison in the pre-design and design stages, to systems champion during the first part of the implementation stage, and finally to catalyst to ensure that the DSS is integrated into the decision-making process. Two practical marketing exercises are undertaken which illustrate the nature of the gap between the provision of information and its use. The lack of a formal approach to planning and control is shown to have a significant effect on the way the DSS is used, and the role of the intermediary is extended successfully to accommodate this factor. This leads to the conclusion that, for the DSS to play a fully effective role, small firms may need to introduce more structure into their marketing planning, and that the role of the intermediary, or Information Coordinator, should include the responsibility for introducing new techniques and ideas to aid with this.
Abstract:
This thesis is a study of low-dimensional visualisation methods for data visualisation under uncertainty of the input data. It focuses on the two main feed-forward neural network algorithms, NeuroScale and Generative Topographic Mapping (GTM), and on making both algorithms able to accommodate this uncertainty. The two models are shown not to work well under high levels of noise within the data and need to be modified. The modifications of both models, NeuroScale and GTM, are verified by using synthetic data to show their ability to accommodate the noise. The thesis then addresses the controversy surrounding the non-uniqueness of predictive gene lists (PGLs) for predicting the prognosis of breast cancer patients from DNA microarray experiments. Many of these studies have ignored the uncertainty issue, resulting in random correlations of sparse model selection in high-dimensional spaces. The visualisation techniques are used to confirm that the patients involved in such medical studies are intrinsically unclassifiable on the basis of the provided PGL evidence. This additional category of ‘unclassifiable’ should be accommodated within medical decision support systems if serious errors and unnecessary adjuvant therapy are to be avoided.
Abstract:
Most contemporary models of spatial vision include a cross-oriented route to suppression (masking from a broadly tuned inhibitory pool), which is most potent at low spatial and high temporal frequencies (T. S. Meese & D. J. Holmes, 2007). The influence of this pathway can elevate orientation-masking functions without exciting the target mechanism, and because early psychophysical estimates of filter bandwidth did not accommodate this, it is likely that they have been overestimated for this corner of stimulus space. Here we show that a transient 40% contrast mask causes substantial binocular threshold elevation for a transient vertical target, and this declines from a mask orientation of 0° to about 40° (indicating tuning), and then more gently to 90°, where it remains at a factor of ∼4. We also confirm that cross-orientation masking is diminished or abolished at high spatial frequencies and for sustained temporal modulation. We fitted a simple model of pedestal masking and cross-orientation suppression (XOS) to our data and those of G. C. Phillips and H. R. Wilson (1984) and found the dependency of orientation bandwidth on spatial frequency to be much less than previously supposed. An extension of our linear spatial pooling model of contrast gain control and dilution masking (T. S. Meese & R. J. Summers, 2007) is also shown to be consistent with our results using filter bandwidths of ±20°. Both models include tightly and broadly tuned components of divisive suppression. More generally, because XOS and/or dilution masking can affect the shape of orientation-masking curves, we caution that variations in bandwidth estimates might reflect variations in processes that have nothing to do with filter bandwidth.
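The specific two-stage models cited in this abstract and the next (T. S. Meese & R. J. Summers, 2007; T. S. Meese, M. A. Georgeson, & D. H. Baker, 2006) are not reproduced here, but the generic divisive-suppression form that this family of masking models builds on can be sketched as follows; the exponents and weights are illustrative placeholders, not the fitted values from those studies.

```latex
% Generic divisive gain-control (normalization) form for pedestal masking
% with a cross-oriented suppression term; C_t and C_m are target and
% cross-oriented mask contrasts, Z a saturation constant, w_x the weight
% of cross-orientation suppression, and k the response increment required
% for detection. Placeholder form, not the authors' fitted model.
r(C_t, C_m) \;=\; \frac{C_t^{\,p}}{Z \;+\; C_t^{\,q} \;+\; w_{x}\,C_m^{\,q}},
\qquad
\text{detection threshold: } \; r(C_t + \Delta C, C_m) - r(C_t, C_m) = k
```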
Abstract:
In order to generate sales promotion response predictions, marketing analysts estimate demand models using either disaggregated (consumer-level) or aggregated (store-level) scanner data. Comparison of predictions from these demand models is complicated by the fact that models may accommodate different forms of consumer heterogeneity depending on the level of data aggregation. This study shows via simulation that demand models with various heterogeneity specifications do not produce more accurate sales response predictions than a homogeneous demand model applied to store-level data, with one major exception: a random coefficients model designed to capture within-store heterogeneity using store-level data produced significantly more accurate sales response predictions (as well as better fit) compared to other model specifications. An empirical application to the paper towel product category adds additional insights. This article has supplementary material online.
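The aggregation issue discussed in this abstract can be made concrete with a small simulation sketch; the logit demand form, parameter values and store-level aggregation below are illustrative assumptions, not the specifications compared in the article.

```python
import numpy as np

rng = np.random.default_rng(1)

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical within-store heterogeneity in price sensitivity
n_consumers = 10_000
alpha = 1.0                                   # assumed purchase intercept
beta_i = rng.normal(2.0, 1.0, n_consumers)    # heterogeneous price sensitivity

prices = np.linspace(0.5, 2.0, 7)             # promotional price points

# Store-level expected unit sales under the heterogeneous model:
# aggregate individual purchase probabilities across consumers.
sales_het = np.array([logistic(alpha - beta_i * p).sum() for p in prices])

# A homogeneous model that plugs in the *mean* sensitivity ignores the
# smoothing effect of aggregating over heterogeneous consumers.
sales_hom = n_consumers * logistic(alpha - beta_i.mean() * prices)

for p, h, m in zip(prices, sales_het, sales_hom):
    print(f"price {p:4.2f}: heterogeneous {h:8.1f}  homogeneous-mean {m:8.1f}")
```

The divergence between the two predicted sales curves illustrates why a store-level model that captures within-store heterogeneity can predict promotion response differently from a homogeneous one fitted to the same aggregate data.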
Cross-orientation masking is speed invariant between ocular pathways but speed dependent within them
Abstract:
In human (D. H. Baker, T. S. Meese, & R. J. Summers, 2007b) and in cat (B. Li, M. R. Peterson, J. K. Thompson, T. Duong, & R. D. Freeman, 2005; F. Sengpiel & V. Vorobyov, 2005) there are at least two routes to cross-orientation suppression (XOS): a broadband, non-adaptable, monocular (within-eye) pathway and a more narrowband, adaptable interocular (between the eyes) pathway. We further characterized these two routes psychophysically by measuring the weight of suppression across spatio-temporal frequency for cross-oriented pairs of superimposed flickering Gabor patches. Masking functions were normalized to unmasked detection thresholds and fitted by a two-stage model of contrast gain control (T. S. Meese, M. A. Georgeson, & D. H. Baker, 2006) that was developed to accommodate XOS. The weight of monocular suppression was a power function of the scalar quantity ‘speed’ (temporal-frequency/spatial-frequency). This weight can be expressed as the ratio of non-oriented magno- and parvo-like mechanisms, permitting a fast-acting, early locus, as befits the urgency for action associated with high retinal speeds. In contrast, dichoptic-masking functions superimposed. Overall, this (i) provides further evidence for dissociation between the two forms of XOS in humans, and (ii) indicates that the monocular and interocular varieties of XOS are space/time scale-dependent and scale-invariant, respectively. This suggests an image-processing role for interocular XOS that is tailored to natural image statistics—very different from that of the scale-dependent (speed-dependent) monocular variety.
Abstract:
In this letter, we directly compare digital back-propagation (DBP) with spectral inversion (SI), both with and without symmetry correction via dispersive chirping, and numerically demonstrate that predispersed SI outperforms traditional SI and approaches the performance of computationally exhaustive ideal DBP. Furthermore, we propose for the first time a practical scheme employing predispersed SI to compensate the bulk of channel nonlinearities, and DBP to accommodate the residual penalties due to varying SI location, with predispersed SI ubiquitously employed along the transmission link with <0.5-dB penalty. Our results also show that predispersed SI enables partial compensation of cross-phase modulation effects, increasing the transmission reach by a factor of two.
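A minimal split-step sketch of single-channel digital back-propagation may help place the comparison above; the fibre parameters, step count and sign convention below are assumptions for illustration only, and real DBP/SI implementations handle fibre loss, amplifier noise and WDM effects that are omitted here.

```python
import numpy as np

def dbp_span(rx, fs, span_km=80.0, steps=20,
             beta2=-21.7e-27, gamma=1.3e-3):
    """Split-step digital back-propagation of one (lossless, single-channel)
    fibre span: apply the inverse dispersion and inverse Kerr nonlinearity
    in small alternating steps.

    rx    : complex baseband samples at the receiver (numpy array)
    fs    : sampling rate in Hz
    beta2 : GVD parameter in s^2/m (sign convention is an assumption here)
    gamma : nonlinear coefficient in 1/(W*m)
    """
    n = rx.size
    dz = span_km * 1e3 / steps                      # step length in metres
    w = 2 * np.pi * np.fft.fftfreq(n, d=1.0 / fs)   # angular frequencies

    # One-step inverse linear (dispersion) operator: negate the forward one.
    disp_inv = np.exp(-1j * 0.5 * beta2 * (w ** 2) * dz)

    a = rx.copy()
    for _ in range(steps):
        a = np.fft.ifft(np.fft.fft(a) * disp_inv)          # undo dispersion
        a *= np.exp(-1j * gamma * np.abs(a) ** 2 * dz)      # undo Kerr phase
    return a
```

The per-step, per-sample FFT work is what makes full-link DBP computationally exhaustive, which is why the letter pairs a single mid-link SI stage with only residual DBP.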
Abstract:
This paper extends existing understandings of how actors' constructions of ambiguity shape the emergent process of strategic action. We theoretically elaborate the role of rhetoric in exploiting strategic ambiguity, based on analysis of a longitudinal case study of an internationalization strategy within a business school. Our data show that actors use rhetoric to construct three types of strategic ambiguity: protective ambiguity that appeals to common values in order to protect particular interests, invitational ambiguity that appeals to common values in order to invite participation in particular actions, and adaptive ambiguity that enables the temporary adoption of specific values in order to appeal to a particular audience at one point in time. These rhetorical constructions of ambiguity follow a processual pattern that shapes the emergent process of strategic action. Our findings show that (1) the strategic actions that emerge are shaped by the way actors construct and exploit ambiguity, (2) the ambiguity intrinsic to the action is analytically distinct from ambiguity that is constructed and exploited by actors, and (3) ambiguity construction shifts over time to accommodate the emerging pattern of actions.
Abstract:
Are persistent marketing effects most likely to appear right after the introduction of a product? The authors give an affirmative answer to this question by developing a model that explicitly reports how persistent and transient marketing effects evolve over time. The proposed model provides managers with a valuable tool to evaluate their allocation of marketing expenditures over time. An application of the model to many pharmaceutical products, estimated through (exact initial) Kalman filtering, indicates that both persistent and transient effects occur predominantly immediately after a brand's introduction. Subsequently, the size of the effects declines. The authors theoretically and empirically compare their methodology with methodology based on unit root testing and demonstrate that the need for unit root tests creates difficulties in applying conventional persistence modeling. The authors recommend that marketing models should either accommodate persistent effects that change over time or be applied to mature brands or limited time windows only.
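A compact sketch of the kind of state-space machinery referred to above (a marketing effect that varies over time, tracked by a Kalman filter): the random-walk specification, parameter values and simulated data below are illustrative assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated data: sales respond to marketing spend through an effect
# beta_t that is largest just after launch and then declines; this decay
# shape is purely an illustration.
T = 100
spend = rng.uniform(0.5, 1.5, T)
beta_true = 2.0 * np.exp(-0.03 * np.arange(T))
sales = beta_true * spend + rng.normal(0, 0.2, T)

# Kalman filter for a random-walk coefficient model:
#   sales_t = beta_t * spend_t + eps_t,   eps_t ~ N(0, R)
#   beta_t  = beta_{t-1}       + eta_t,   eta_t ~ N(0, Q)
R, Q = 0.2 ** 2, 0.01
beta_hat, P = 0.0, 1.0                 # vague initial state
filtered = np.empty(T)
for t in range(T):
    P = P + Q                                          # predict state variance
    H = spend[t]                                       # time-varying regressor
    K = P * H / (H * P * H + R)                        # Kalman gain
    beta_hat = beta_hat + K * (sales[t] - H * beta_hat)  # measurement update
    P = (1 - K * H) * P
    filtered[t] = beta_hat

print("early-period effect :", filtered[:10].round(2))
print("late-period effect  :", filtered[-10:].round(2))
```

Letting the coefficient itself evolve as a state is what allows persistent effects to be large near launch and smaller later, which a single constant-coefficient (or unit-root-tested) specification cannot represent.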
Abstract:
Innovation is vital if organisations are to deal effectively with social and economic change. Yet few studies have looked at the relationship between teamworking and innovation – or, indeed, other organisational outcomes. Our research aimed to fill this gap by exploring the extent to which team-based working in small- and medium-sized manufacturing organisations predicted product innovation. The results show that levels of innovation are higher in organisations using work-based teams than in those with alternative structural arrangements. We also found that effective HRM practices, such as sophisticated selection, induction, appraisal, training and remuneration management, created an environment that allowed teams to excel. The study drew on a variety of sources, including data on organisational-level innovation gathered through a postal survey. Respondents gave estimates of the number of new or adapted products developed in the past two years. They also detailed the percentage of production workers involved in making the new products; the sales turnover accounted for by these products; and how far production processes had been changed to accommodate the innovations. We measured HRM effectiveness and the extent of teamworking via interviews with the relevant HR or production manager. We then rated each organisation on a scale of one to five, according to how effective its HRM practices were. We also examined the percentage of staff at management and shopfloor levels engaged in teamworking. The research design was longitudinal, in that the data on product innovation were collected six months to a year after the main questionnaire on teamworking was conducted. Other studies addressing these questions have tended to be cross-sectional, measuring both variables at the same time. Longitudinal studies generally make a stronger case for causality. Perhaps of most theoretical significance is the finding that teamworking combined with effective HR systems explains more of the variance in product innovation than teamworking alone. This is in line with J Richard Hackman (1990), who argued that organisational context affected team performance in various ways – for example, through offering a framework for the administration of reward and the exchange of knowledge and through promoting learning-oriented beliefs. Our work supports these ideas. This study also has practical implications. Increasing the number of teams may be an important step in determining the extent to which they can innovate on a sustained basis. Organisations should therefore consider what HRM practices are most likely to foster team innovation. They might, for example, explore how helpful it would be to develop team-based appraisal and better designed teamworking training. Developing support structures that enable teams to achieve outstanding performance may present a challenge, but our results suggest that such an approach will be worth the effort. Key points:
• The greater the percentage of staff working in teams, the higher the level of innovation.
• This applies to both management and production teams.
• Where sophisticated and effective HRM practices are in place, the relationship between team-based working and product innovation becomes more pronounced.
• Both cross-sectional and longitudinal analyses show strong relationships between team-based working and product innovation.
Abstract:
Renewable energy project development is highly complex and success is by no means guaranteed. Decisions are often made with approximate or uncertain information, yet the current methods employed by decision-makers do not necessarily accommodate this. Levelised energy costs (LEC) are one such commonly applied measure, utilised within the energy industry to assess the viability of potential projects and inform policy. The research proposes a method for accommodating uncertain information by enhancing the traditional discounting-based LEC measure with fuzzy set theory. Furthermore, the research develops the fuzzy LEC (F-LEC) methodology to incorporate the cost of financing a project from debt and equity sources. Applied to an example bioenergy project, the research demonstrates the benefit of incorporating fuzziness for project viability, optimal capital structure and key variable sensitivity analysis decision-making. The proposed method contributes by incorporating uncertain and approximate information into the widely utilised LEC measure and by being applicable to a wide range of energy project viability decisions. © 2013 Elsevier Ltd. All rights reserved.
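A minimal sketch of the underlying idea (not the F-LEC method developed in the paper): represent uncertain costs as triangular fuzzy numbers and, because the levelised cost is monotonically increasing in each cost input for a fixed discount rate, evaluate the standard LEC formula at the low, modal and high vertices to obtain a triangular fuzzy LEC. All numbers below are hypothetical.

```python
# Levelised energy cost with triangular-fuzzy cost inputs: a simplified
# illustration, not the F-LEC methodology of the paper.
from dataclasses import dataclass

@dataclass
class Tri:
    low: float
    mode: float
    high: float

def lec(capex, annual_cost, annual_mwh, rate=0.08, years=20):
    """Standard levelised cost: discounted costs / discounted energy."""
    disc_cost = capex + sum(annual_cost / (1 + rate) ** t for t in range(1, years + 1))
    disc_energy = sum(annual_mwh / (1 + rate) ** t for t in range(1, years + 1))
    return disc_cost / disc_energy            # cost per MWh

# Hypothetical bioenergy project inputs (fuzzy costs, crisp output)
capex = Tri(18e6, 20e6, 24e6)                 # capital cost
opex = Tri(1.0e6, 1.2e6, 1.5e6)               # annual operating cost
energy_mwh = 60_000                           # assumed crisp annual output

# LEC is increasing in each cost, so the vertices map directly to a
# triangular fuzzy LEC (low / mode / high).
fuzzy_lec = Tri(
    lec(capex.low, opex.low, energy_mwh),
    lec(capex.mode, opex.mode, energy_mwh),
    lec(capex.high, opex.high, energy_mwh),
)
print(f"Fuzzy LEC per MWh: {fuzzy_lec.low:.1f} / {fuzzy_lec.mode:.1f} / {fuzzy_lec.high:.1f}")
```

Extending this to fuzzy discount rates or fuzzy financing costs, as the paper does, requires α-cut interval arithmetic rather than the simple vertex evaluation shown here.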
Direct measurement of coherency limits for strain relaxation in heteroepitaxial core/shell nanowires
Abstract:
The growth of heteroepitaxially strained semiconductors at the nanoscale enables tailoring of material properties for enhanced device performance. For core/shell nanowires (NWs), theoretical predictions of the coherency limits and the implications they carry remain uncertain without proper identification of the mechanisms by which strains relax. We present here for the Ge/Si core/shell NW system the first experimental measurement of critical shell thickness for strain relaxation in a semiconductor NW heterostructure and the identification of the relaxation mechanisms. Axial and tangential strain relief is initiated by the formation of periodic a/2 〈110〉 perfect dislocations via nucleation and glide on {111} slip-planes. Glide of dislocation segments is directly confirmed by real-time in situ transmission electron microscope observations and by dislocation dynamics simulations. Further shell growth leads to roughening and grain formation which provides additional strain relief. As a consequence of core/shell strain sharing in NWs, a 16 nm radius Ge NW with a 3 nm Si shell is shown to accommodate 3% coherent strain at equilibrium, a factor of 3 increase over the 1 nm equilibrium critical thickness for planar Si/Ge heteroepitaxial growth. © 2012 American Chemical Society.
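For context on the strain figures quoted above, the bulk Ge/Si lattice mismatch sets the scale of the coherency strain a coherent shell must absorb; the lattice constants below are standard bulk values, not measurements from this paper.

```latex
% Lattice misfit between Si and Ge (bulk lattice constants
% a_Si = 5.431 Å, a_Ge = 5.658 Å):
f \;=\; \frac{a_{\mathrm{Ge}} - a_{\mathrm{Si}}}{a_{\mathrm{Si}}}
  \;=\; \frac{5.658 - 5.431}{5.431} \;\approx\; 4.2\%
```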