31 results for New approaches
in Aston University Research Archive
Abstract:
The retrieval of wind vectors from satellite scatterometer observations is a non-linear inverse problem. A common approach to solving inverse problems is to adopt a Bayesian framework and to infer the posterior distribution of the parameters of interest given the observations by using a likelihood model relating the observations to the parameters, and a prior distribution over the parameters. We show how Gaussian process priors can be used efficiently with a variety of likelihood models, using local forward (observation) models and direct inverse models for the scatterometer. We present an enhanced Markov chain Monte Carlo method to sample from the resulting multimodal posterior distribution. We go on to show how the computational complexity of the inference can be controlled by using a sparse, sequential Bayes algorithm for estimation with Gaussian processes. This helps to overcome the most serious barrier to the use of probabilistic, Gaussian process methods in remote sensing inverse problems, which is the prohibitively large size of the data sets. We contrast the sampling results with the approximations that are found by using the sparse, sequential Bayes algorithm.
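A heavily simplified sketch of the kind of inference this abstract describes: a Gaussian process prior over a one-dimensional field, a toy forward model whose sign ambiguity makes the posterior multimodal (a crude stand-in for the scatterometer's directional ambiguity), and a plain random-walk Metropolis sampler rather than the authors' enhanced MCMC. All models and numbers here are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "wind field" at n locations with a squared-exponential GP prior.
n = 10
x = np.linspace(0.0, 1.0, n)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.1 ** 2) + 1e-8 * np.eye(n)
K_inv = np.linalg.inv(K)

# Hypothetical forward model: the instrument "observes" the squared field,
# so +f and -f fit the data equally well and the posterior is multimodal.
f_true = np.sin(2 * np.pi * x)
sigma = 0.1
y = f_true ** 2 + sigma * rng.normal(size=n)

def log_post(f):
    """Unnormalised log posterior: GP prior plus Gaussian likelihood."""
    log_prior = -0.5 * f @ K_inv @ f
    log_lik = -0.5 * np.sum((y - f ** 2) ** 2) / sigma ** 2
    return log_prior + log_lik

# Random-walk Metropolis: far simpler than the enhanced MCMC of the
# abstract, but enough to wander between the mirrored posterior modes.
f = rng.normal(size=n)
samples = []
for _ in range(20000):
    prop = f + 0.05 * rng.normal(size=n)
    if np.log(rng.uniform()) < log_post(prop) - log_post(f):
        f = prop
    samples.append(f.copy())
samples = np.array(samples)
```

The sparse, sequential Bayes approximation contrasted in the abstract would replace this sampler with an online update over a small active set of inducing points, trading exactness for tractability on large data sets.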
Abstract:
The use of immunological adjuvants has been established since 1924, and ever since many candidates have been extensively researched in vaccine development. The controlled release of vaccines is another area of biotechnology research which is advancing rapidly, with great potential and success. Encapsulation of peptide and protein drugs within biodegradable microspheres has been amongst the most successful approaches of the past decade. The present studies have focused on combining the advantages of microsphere delivery systems composed of biodegradable polylactide (PLLA) and polylactide-co-glycolide (PLGA) polymers with those of safe and effective adjuvants. The research efforts were directed to the development of single-dose delivery vehicles which can be manufactured easily and safely, under conditions that are mild and favourable to the encapsulated antigens. In pursuing this objective, non-ionic block copolymers (NIBCs) (Pluronics® L101 and L121) were incorporated within poly-dl-lactide (PDLA) microspheres prepared by the emulsification-diffusion method. L101 and L121 served both as adjuvants and as stabilising agents within these vaccine delivery vehicles. Formulations encapsulating the model antigens lysozyme, ovalbumin (OVA) and diphtheria toxoid (DT) achieved high entrapment efficiency (99%) and yield (96.7%), and elicited a high and sustained immune response (IgG titres up to 9427) over nine months after a single administration. The structural integrity of the antigens was preserved within these formulations. In evaluating new approaches for the use of well-established adjuvants such as alum, alum particles were incorporated within PLLA and PLGA microspheres in much smaller quantities (5-10 times lower) than those contained in conventional alum-adsorbed vaccines. These studies focused on the incorporation of the clinically relevant tetanus toxoid (TT) antigen within biodegradable microspheres.
The encapsulation of both alum particles and TT antigen within these microspheres resulted in preparations with high encapsulation efficiency (95%) and yield (91.2%). The immune response to these particles was also investigated, evaluating the secretion of serum IgG, IgG1, IgG2a and IgG2b after a single administration of the vaccines. Splenic cell proliferation was investigated as an indication of the induction of cell-mediated immunity. These particles produced a high and sustained immune response over a period of 14 months. The stability of TT within the particles was also investigated under dry storage over a period of several months. NIBC microspheres were additionally investigated as potential DNA vaccine delivery systems using a hepatitis B plasmid. These preparations yielded microspheres of 3-5 μm diameter and were shown to preserve the integrity of the encapsulated hepatitis B plasmid (27.7% entrapment efficiency).
Abstract:
Whilst most of the literature focusing on the Korean peninsula has concentrated on how to achieve unification through confidence-building measures, dialogues, negotiation and diplomacy, little attention has been paid to how a unified Korean identity, a core component of any potential reunification scheme, could develop and be sustained. The paper addresses this gap by: (1) defining what national identity is and how Korean identities have been formed; (2) outlining how both South and North Korea have understood and used the concept of national identity; and (3) suggesting possible grounds on which the two Koreas could build a new, common national identity. © 2014 The Regents of the University of California.
Abstract:
Although atypical social behaviour remains a key characterisation of ASD, the presence of sensory and perceptual abnormalities has been given a more central role in recent classification changes. An understanding of the origins of such aberrations could thus prove a fruitful focus for ASD research. Early neurocognitive models of ASD suggested that the study of high frequency activity in the brain as a measure of cortical connectivity might provide the key to understanding the neural correlates of sensory and perceptual deviations in ASD. As our review shows, the findings from subsequent research have been inconsistent, with a lack of agreement about the nature of any high frequency disturbances in ASD brains. Based on the application of new techniques using more sophisticated measures of brain synchronisation, direction of information flow, and invoking the coupling between high and low frequency bands, we propose a framework which could reconcile apparently conflicting findings in this area and would be consistent both with emerging neurocognitive models of autism and with the heterogeneity of the condition.
Abstract:
Back in 2003, we published ‘MAX’ randomisation, a process of non-degenerate saturation mutagenesis using exactly 20 codons (one for each amino acid) or any required subset of those 20 codons. ‘MAX’ randomisation saturates codons located in isolated positions within a protein, as might be required in enzyme engineering, or on one face of an alpha-helix, as in zinc finger engineering. Since that time, we have been asked for an equivalent process that can saturate multiple, contiguous codons in a non-degenerate manner. We have now developed ‘ProxiMAX’ randomisation, which does just that: generating DNA cassettes for saturation mutagenesis without degeneracy or bias. Offering an alternative to trinucleotide phosphoramidite chemistry, ProxiMAX randomisation uses nothing more sophisticated than unmodified oligonucleotides and standard molecular biology reagents. It thus requires no specialised chemistry, reagents or equipment, relying simply on a process of saturation cycling comprising ligation, amplification and digestion for each cycle. The process can encode either unbiased representation of the selected amino acids or pre-defined ratios of them, and each saturated position can be defined independently of the others. We demonstrate accurate saturation of up to 11 contiguous codons. As such, ProxiMAX randomisation is particularly relevant to antibody engineering.
Abstract:
An increasing interest in “bringing actors back in” and gaining a nuanced understanding of their actions and interactions across a variety of strands in the management literature has recently helped propel ethnography to new prominence in the field of organizational studies. Yet calls remain that ethnography should “play a much more central role in the organization and management studies repertoire than it currently does” (Watson, 2011: 202). Ironically, the organizational realities that ethnographers are called to examine have at the same time become less and less amenable to ethnographic study. In this paper, we respond to these calls for innovative ethnographic methods in two ways. First, we report on the practices and ethnographic experiences of conducting a year-long team-based video ethnography of reinsurance trading in Lloyd's of London. Second, drawing on these experiences, we propose an initial framework for systematizing new approaches to organizational ethnography and visualizing the ways in which they are ‘expanding’ ethnography as it was traditionally practiced.
Abstract:
The multivariable and progressive natural history of type 2 diabetes limits the effectiveness of available glucose-lowering drugs. Constraints imposed by comorbidities (notably cardiovascular disease and renal impairment) and the need to avoid hypoglycaemia, weight gain, and drug interactions further complicate the treatment process. These challenges have prompted the development of new formulations and delivery methods for existing drugs alongside research into novel pharmacological entities. Advances in incretin-based therapies include a miniature implantable osmotic pump to give continuous delivery of a glucagon-like peptide-1 receptor agonist for 6-12 months and once-weekly tablets of dipeptidyl peptidase-4 inhibitors. Hybrid molecules that combine the properties of selected incretins and other peptides are at early stages of development, and proof of concept has been shown for small non-peptide molecules to activate glucagon-like peptide-1 receptors. Additional sodium-glucose co-transporter inhibitors are progressing in development as well as possible new insulin-releasing biological agents and small-molecule inhibitors of glucagon action. Adiponectin receptor agonists, selective peroxisome proliferator-activated receptor modulators, cellular glucocorticoid inhibitors, and analogues of fibroblast growth factor 21 are being considered as potential new approaches to glucose lowering. Compounds that can enhance insulin receptor and post-receptor signalling cascades or directly promote selected pathways of glucose metabolism have suggested opportunities for future treatments. However, pharmacological interventions that are able to restore normal β-cell function and β-cell mass, normalise insulin action, and fully correct glucose homoeostasis are a distant vision.
Abstract:
Anyone who looks at the title of this special issue will agree that the intent behind the preparation of this volume was ambitious: to predict and discuss “The Future of Manufacturing”. Will manufacturing be important in the future? Even though some sceptics might say not, putting some old familiar arguments on the table, we would strongly disagree. To provide substance for the argument, we issued the call for papers for this special issue of the Journal of Manufacturing Technology Management, fully aware of the size of the challenge in our hands but strongly believing that the enterprise would be worthwhile. The point of departure is the ongoing debate concerning the meaning and content of manufacturing. The easily visualised internal activity of using tangible resources to make physical products in factories is no longer a viable way to characterise manufacturing. It is now a more loosely defined concept concerning the organisation and management of open, interdependent systems for delivering goods and services, tangible and intangible, to diverse types of markets. Interestingly, Wickham Skinner is the most cited author in this special issue of JMTM. He provides the departure point of several articles because his vision and insights have guided and inspired researchers in production and operations management from the late 1960s until today. However, the picture that we draw after looking at the contributions in this special issue is intrinsically distinct, much more dynamic, and complex.
Seven articles address the following research themes: (1) new patterns of organisation, where the boundaries of firms become blurred and the role of the firm in the production system, as well as that of manufacturing within the firm, becomes contingent; (2) new approaches to strategic decision-making in markets characterised by turbulence and weak signals at the customer interface; (3) new challenges in strategic and operational decisions due to changes in the profile of the workforce; (4) new global players, especially China, modifying the manufacturing landscape; and (5) new techniques, methods and tools that are being made feasible through progress in new technological domains. Of course, many other important dimensions could be studied, but these themes are representative of current changes and future challenges. Three articles look at the first theme: organisational evolution of production and operations in firms and networks. Karlsson and Skold's article represents one further step in their efforts to characterise “the extraprise”. In the article, they advance the construction of a new framework, based on “the network perspective”, by defining the formal elements which compose it and exploring the meaning of different types of relationships. The way in which “actors, resources and activities” are conceptualised extends the existing boundaries of analytical thinking in operations management and opens new avenues for research, teaching and practice. The higher level of abstraction, an intrinsic feature of the framework, is associated with the increasing degree of complexity that characterises decisions related to strategy and implementation in the manufacturing and operations area, a feature that is expected to become more and more pervasive as time proceeds. Riis, Johansen, Englyst and Sorensen have also based their article on their previous work, which in this case is on “the interactive firm”.
They advance new propositions on the strategic roles of manufacturing and discuss why the configuration of strategic manufacturing roles, at the level of the network, will become a key issue and how the indirect strategic roles of manufacturing will become increasingly important. Additionally, by considering that value chains will become value webs, they predict that shifts in strategic manufacturing roles will look like a sequence of moves similar to a game of chess. Then, lastly under the first theme, Fleury and Fleury develop a conceptual framework for the study of production systems in general, derived from field research in the telecommunications industry, here considered a prototype of the coming information society and knowledge economy. They propose a new typology of firms which, on certain dimensions, complements the propositions found in the other two articles. Their telecoms-based framework (TbF) comprises six types of companies characterised by distinct profiles of organisational competences, which interact according to specific patterns of relationships, thus creating distinct configurations of production networks. The second theme is addressed by Kyläheiko and Sandström in their article “Strategic options based framework for management of dynamic capabilities in manufacturing firms”. They propose a new approach to strategic decision-making in markets characterised by turbulence and weak signals at the customer interface. Their framework for a manufacturing firm in the digital age leads to active asset selection (strategic investments in both tangible and intangible assets) and efficient orchestration of the global value net in “thin” intangible asset markets. The framework consists of five steps based on Porter's five-forces model and the resource-based view, complemented by the concepts of strategic options and related flexibility issues.
Thun, Grössler and Miczka's contribution to the third theme brings the human dimension to the debate on the future of manufacturing. Their article focuses on the challenges that the ageing of workers in Germany brings to management but, in the arguments that are raised, the future challenges associated with workers and work organisation in every production system become visible and relevant. An interesting point in the authors' approach is that not only are factual problems and solutions taken into account, but the perceptions of managers are also brought into the picture. China cannot be absent from a discussion of the future of manufacturing. Therefore, within the fourth theme, Vaidya, Bennett and Liu provide evidence of the gradual improvement of Chinese companies in the medium- and high-tech sectors, using revealed comparative advantage (RCA) analysis. The Chinese evolution is shown to be based on capabilities developed through combining international technology transfer and indigenous learning. The main implication for Western companies is the need to take account of the accelerated rhythm of capability development in China; for other developing countries, China's case provides lessons of great importance. Finally, under the fifth theme, Kuehnle's article, “Post mass production paradigm (PMPP) trajectories”, provides a futuristic scenario of what is already around us and might become prevalent in the future. It takes an intensive look at a whole set of dimensions that are affecting manufacturing now and will influence manufacturing in the future, ranging from the application of ICT to the need for social transparency. In summary, this special issue of JMTM presents a brief but indisputable demonstration of the possible richness of manufacturing in the future. Indeed, we could even say that manufacturing has no future if we only stick to past perspectives. Embracing the new is not easy.
The new configurations of production systems, the distributed and complementary roles to be performed by distinct types of companies in diversified networked structures, leveraged by newly emerging technologies and associated with new challenges for managing people, are all themes that are carriers of the future. The Guest Editors of this special issue on the future of manufacturing are strongly convinced that their undertaking has been worthwhile.
Abstract:
At the beginning of the 1980s, new approaches to translation were emerging in such a way that, in the global context of postmodernism and poststructuralism, they provoked a reassessment of Translation Studies (TS), acknowledging ideology as a concept relevant to TS and considering the political and visible role of the translator. This introduction aims to establish a basic theoretical framework in which we can develop an analysis of the ‘alterations’ that, consciously or unconsciously, translators have imposed on Le deuxième sexe (1949, Gallimard) by Simone de Beauvoir over the last fifty years. Furthermore, it is essential to examine the divergence between the censoring attitude adopted by the first male translators (Parshley, Palant and Milliet), who considered this text to be a sex manual, and the attitude adopted by more recent female translators (Martorell and Simons), who considered it to be a philosophical book on feminism. Nevertheless, despite the tendency to consider translators the only professionals responsible for the translation process, it is necessary to bear in mind the work carried out by the paratranslator, who is the real censor and ‘decider’ of the way a work is presented to the translation community. Paratranslators work with paratexts (also known as ‘analysis-spaces’), and this makes it possible to study the ideological adaptation that a cultural object undergoes when it is incorporated into a new culture and society (covers, volumes, tables of contents, titles, iconic or visual elements and so forth).
In short, the analysis of the texts and paratexts of Le deuxième sexe, along with its subsequent translations and rewritings into Spanish, Portuguese and English, will help reveal the function of the censoring apparatus and demonstrate the essential role that ideologies, without exception, play in the professional work of translation and paratranslation, since they have a decisive influence on the reception of the cultural (and ideological) object, both in the society in which it is created and in that in which it is received.
Abstract:
This paper explores how participants work in a new format of brainstorm, called an 'incubated gathering'. The paper explores brainstorm-type activities in which senior managers share occupational knowledge to generate a solution to a problem in which they have an embedded interest. The findings suggest that participants perceive great worth in the incubated gathering and that, compared to other formats of gathering, it allows participants to consider a significantly wider range of issues, and in more detail. They also suggest that an outcome from an incubated gathering will be informed by a richer consideration of the pertinent issues than one from other formats of brainstorming. The paper substantiates these claims with evidence from a suite of new approaches to gauging the ability of participants to share knowledge during group brainstorming-type activities.
Abstract:
This thesis is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real-world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variant of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here two new extended frameworks are derived and presented that are based on basis function expansions and local polynomial approximations of a recently proposed variational Bayesian algorithm. It is shown that the new extensions converge to the original variational algorithm and can be used for state estimation (smoothing). However, the main focus is on estimating the (hyper-)parameters of these systems (i.e. drift parameters and diffusion coefficients). The new methods are numerically validated on a range of systems which vary in dimensionality and non-linearity: the Ornstein-Uhlenbeck process, for which the exact likelihood can be computed analytically; the univariate and highly non-linear stochastic double well; and the multivariate chaotic stochastic Lorenz '63 (3-dimensional) model. The algorithms are also applied to the 40-dimensional stochastic Lorenz '96 system. These new approaches are compared with a variety of well-known methods, such as the ensemble Kalman filter/smoother, a hybrid Monte Carlo sampler, the dual unscented Kalman filter (for jointly estimating the system states and model parameters) and full weak-constraint 4D-Var. Empirical analysis of their asymptotic behaviour as the observation density or the length of the time window increases is provided.
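The Ornstein-Uhlenbeck process serves as a ground-truth benchmark in this abstract precisely because its Gaussian transition density gives an exact likelihood. A minimal sketch of that baseline, assuming a zero-mean drift and illustrative parameter values not taken from the thesis:

```python
import numpy as np

rng = np.random.default_rng(1)

# OU process dX = -theta * X dt + sigma dW: between observations spaced
# dt apart, X_{t+dt} | X_t is exactly Gaussian with the moments below.
theta, sigma, dt, n = 1.0, 0.5, 0.1, 500

a = np.exp(-theta * dt)                               # transition mean factor
v = sigma ** 2 / (2 * theta) * (1 - np.exp(-2 * theta * dt))  # transition var

# Simulate one path using the exact transition distribution (no Euler error).
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = a * x[t - 1] + np.sqrt(v) * rng.normal()

def log_likelihood(th, sig):
    """Exact log-likelihood of the observed path under OU parameters (th, sig)."""
    a_ = np.exp(-th * dt)
    v_ = sig ** 2 / (2 * th) * (1 - np.exp(-2 * th * dt))
    resid = x[1:] - a_ * x[:-1]
    return -0.5 * np.sum(resid ** 2 / v_ + np.log(2 * np.pi * v_))
```

An approximate scheme such as the variational algorithm in the thesis can be checked against this exact `log_likelihood` surface when tuning drift and diffusion parameters.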
Abstract:
Reflecting changes in the nature of governance, some have questioned whether Public Administration is now an historical anachronism. While a legitimate debate exists between sceptics and optimists, this special issue demonstrates grounds for optimism by indicating the continuing diversity and adaptability of the field of Public Administration. In this introduction, we first sketch the variety of intellectual traditions which comprise the field of modern Public Administration. We then consider institutional challenges facing the subject given considerable pressures towards disciplinary fragmentation, and ideological challenges arising from a new distrust of public provision in the UK. Despite these challenges, Public Administration continues to provide a framework to analyse the practice of government and governance, governing institutions and traditions, and their wider sociological context. It can also directly inform policy reform - even if this endeavour can have its own pitfalls and pratfalls for the 'engaged' academic. We further suggest that, rather than lacking theoretical rigour, new approaches are developing that recognise the structural and political nature of the determinants of public administration. Finally, we highlight the richness of modern comparative work in Public Administration. Researchers can usefully look beyond the Atlantic relationship for theoretical enhancement and also consider more seriously the recursive and complex nature of international pressures on public administration. © The Author(s) 2012.
Abstract:
After decades of slow progress, the pace of research on membrane protein structures is beginning to quicken thanks to various improvements in technology, including protein engineering and microfocus X-ray diffraction. Here we review these developments and, where possible, highlight generic new approaches to solving membrane protein structures based on recent technological advances. Rational approaches to overcoming the bottlenecks in the field are urgently required as membrane proteins, which typically comprise ~30% of the proteomes of organisms, are dramatically under-represented in the structural database of the Protein Data Bank.
Abstract:
Objectives: Are behavioural interventions effective in reducing the rate of sexually transmitted infections (STIs) among genitourinary medicine (GUM) clinic patients? Design: Systematic review and meta-analysis of published articles. Data sources: Medline, CINAHL, Embase, PsycINFO, Applied Social Sciences Index and Abstracts, Cochrane Library Controlled Clinical Trials Register, National Research Register (1966 to January 2004). Review methods: Randomised controlled trials of behavioural interventions in sexual health clinic patients were included if they reported change in STI rates or self-reported sexual behaviour. Trial quality was assessed using the Jadad score, and results were pooled using random-effects meta-analyses where outcomes were consistent across studies. Results: 14 trials were included, 12 based in the United States. Experimental interventions were heterogeneous and most control interventions were more structured than typical UK care. Eight trials reported data on laboratory-confirmed infections, of which four observed a greater reduction in their intervention groups (in two cases this result was statistically significant, p<0.05). Seven trials reported consistent condom use, of which six observed a greater increase among their intervention subjects. Results for other measures of sexual behaviour were inconsistent. Success in reducing STIs was related to trial quality, use of social cognition models, and formative research in the target population. However, effectiveness was not related to intervention format or length. Conclusions: While results were heterogeneous, several trials observed reductions in STI rates. The most effective interventions were developed through extensive formative research. These findings should encourage further research in the United Kingdom, where new approaches to preventing STIs are urgently required.
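The random-effects pooling mentioned in the review methods is commonly done with the DerSimonian-Laird estimator; a minimal sketch follows. The trial effect sizes and variances below are invented for illustration and are not the review's data.

```python
import numpy as np

# DerSimonian-Laird random-effects pooling of trial effect sizes: the
# standard approach when, as here, between-trial heterogeneity is expected.
effects = np.array([-0.40, -0.15, -0.55, 0.05])   # hypothetical log odds ratios
variances = np.array([0.04, 0.09, 0.06, 0.05])    # hypothetical within-trial variances

w = 1.0 / variances                                # fixed-effect (inverse-variance) weights
pooled_fe = np.sum(w * effects) / np.sum(w)

# Cochran's Q statistic and the DL estimate of between-trial variance tau^2.
q = np.sum(w * (effects - pooled_fe) ** 2)
k = len(effects)
tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

# Random-effects weights fold tau^2 into each trial's variance.
w_re = 1.0 / (variances + tau2)
pooled_re = np.sum(w_re * effects) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
ci = (pooled_re - 1.96 * se_re, pooled_re + 1.96 * se_re)
```

When tau² is estimated as zero the random-effects result collapses to the fixed-effect pooled estimate, which is why reviews typically report heterogeneity (Q or I²) alongside the pooled effect.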