15 results for New paradigm
in the Aston University Research Archive
Abstract:
The WDM properties of dispersion managed (DM) solitons and the reduction in Gordon-Haus jitter mean that it is possible to contemplate multiple channels, each at 10 Gbit/s, for transoceanic distances without the need for elaborate soliton control. This paper will concentrate on the fundamental principles of DM solitons, but will use these principles to indicate optimum maps for future high-speed soliton systems.
Abstract:
The WDM properties of dispersion managed (DM) solitons and the reduction in Gordon-Haus jitter mean that it is possible to contemplate multiple channels, each at 10 Gbit/s, for transoceanic distances without the need for elaborate soliton control. This paper will concentrate on the fundamental principles of DM solitons, but will use these principles to indicate optimum maps for future high-speed soliton systems.
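The abstract does not reproduce the underlying model; as context only, dispersion-managed soliton propagation is conventionally described by a nonlinear Schrödinger equation with a periodically varying dispersion coefficient, and Gordon-Haus jitter arises when amplifier noise randomly shifts the soliton's centre frequency, which dispersion then converts into a timing random walk. A standard textbook formulation, not taken from the paper, is:

```latex
% Dispersion-managed NLSE: u(z,t) is the pulse envelope and d(z) the
% periodically varying group-velocity dispersion of the map.
\[
  i\,\frac{\partial u}{\partial z}
  + \frac{d(z)}{2}\,\frac{\partial^2 u}{\partial t^2}
  + |u|^2 u = 0,
  \qquad d(z + L_{\mathrm{map}}) = d(z).
\]
% Gordon-Haus jitter: amplifier noise makes the centre-frequency offset
% \delta\Omega a random walk with rate \sigma_\Omega^2 per unit length;
% dispersion converts this into a timing random walk whose variance
% grows cubically with distance,
\[
  \sigma_t^2(z) \approx \bar{d}^{\,2}\,\sigma_\Omega^2\,\frac{z^3}{3},
\]
% so lowering the path-averaged dispersion \bar{d}, or raising the pulse
% energy (which reduces \sigma_\Omega^2), suppresses the jitter that DM
% solitons are credited with reducing.
```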
Abstract:
Pre-eclampsia is a vascular disorder of pregnancy where anti-angiogenic factors, systemic inflammation and oxidative stress predominate, but none of these alone can be said to cause the condition. This review provides an alternative to the 'two-stage model' of pre-eclampsia, in which abnormal modification of the spiral arteries leads to placental hypoxia, oxidative stress and aberrant maternal systemic inflammation. Very high maternal soluble fms-like tyrosine kinase-1 (sFlt-1, also known as sVEGFR) and very low placenta growth factor (PlGF) are unique to pre-eclampsia; however, abnormal spiral arteries and excessive inflammation are also prevalent in other placental disorders. Metaphorically speaking, pregnancy can be viewed as a car with an accelerator and brakes, where inflammation, oxidative stress and an imbalance in the angiogenic milieu act as the 'accelerator'. The 'braking system' includes the protective pathways of haem oxygenase 1 (also referred to as Hmox1 or HO-1) and cystathionine-γ-lyase (also known as CSE or Cth), which generate carbon monoxide (CO) and hydrogen sulphide (H2S) respectively. Failure of these pathways (the brakes) results in the pregnancy going out of control and the system crashing. Put simply, pre-eclampsia is an accelerator-brake defect disorder. CO and H2S hold great promise because of their unique ability to suppress the anti-angiogenic factors sFlt-1 and soluble endoglin as well as to promote PlGF and endothelial NOS activity. The key to finding a cure lies in the identification of cheap, safe and effective drugs that induce the braking system to keep the pregnancy vehicle on track past the finishing line.
Abstract:
This paper presents for the first time the concept of measurement assisted assembly (MAA) and outlines the research priorities for the realisation of this concept in industry. MAA denotes a paradigm shift in assembly for high value and complex products and encompasses the development and use of novel metrology processes for the holistic integration and capability enhancement of key assembly and ancillary processes. A complete framework for MAA is detailed, showing how it can facilitate a step change in assembly process capability and efficiency for large and complex products, such as airframes, where traditional assembly processes exhibit the requirement for rectification and rework, use inflexible tooling and are largely manual, resulting in cost and cycle time pressures. The concept of MAA encompasses a range of innovative measurement-assisted processes which enable rapid part-to-part assembly, increased use of flexible automation, traceable quality assurance and control, reduced structure weight and improved levels of precision across the dimensional scales. A full-scale industrial trial of MAA technologies has been carried out on an experimental aircraft wing, demonstrating the viability of the approach, while studies within 140 smaller companies have highlighted the need for better adoption of existing process capability and quality control standards. The identified research priorities for MAA include the development of both frameless and tooling-embedded automated metrology networks. Other research priorities relate to the development of integrated dimensional variation management and thermal compensation algorithms, as well as measurement planning and inspection algorithms linking design to measurement and process planning. © Springer-Verlag London 2013.
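Of the research priorities listed above, thermal compensation is the easiest to illustrate concretely. The sketch below shows only the basic linear-expansion correction of a measured length back to the 20 °C reference temperature; the function name, coefficient value and example numbers are illustrative and are not taken from the paper.

```python
def compensate_length(measured_mm: float, temp_c: float,
                      alpha_per_c: float = 23e-6) -> float:
    """Correct a length measured at temp_c back to the 20 deg C reference.

    measured_mm -- length observed at temp_c, in millimetres
    temp_c      -- part temperature during measurement, in deg C
    alpha_per_c -- linear expansion coefficient (23e-6/K is a typical
                   aluminium value, used here purely as an illustration)
    """
    # Linear expansion model: L(T) = L_20 * (1 + alpha * (T - 20))
    return measured_mm / (1.0 + alpha_per_c * (temp_c - 20.0))


# Example: a nominally 2000 mm aluminium part measured at 26 deg C
# reads about 0.28 mm long; compensation recovers ~2000.0 mm.
print(compensate_length(2000.276, 26.0))
```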
Abstract:
Multidimensional compound optimization is a new paradigm in the drug discovery process, yielding efficiencies during early stages and reducing attrition in the later stages of drug development. The success of this strategy relies heavily on understanding this multidimensional data and extracting useful information from it. This paper demonstrates how principled visualization algorithms can be used to understand and explore a large data set created in the early stages of drug discovery. The experiments presented are performed on a real-world data set comprising biological activity data and some whole-molecule physicochemical properties. Data visualization is a popular way of presenting complex data in a simpler form. We have applied powerful principled visualization methods, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), to help the domain experts (screening scientists, chemists, biologists, etc.) understand the data and draw meaningful conclusions. We also benchmark these principled methods against better-known visualization approaches, principal component analysis (PCA), Sammon's mapping, and self-organizing maps (SOMs), to demonstrate their enhanced power to help the user visualize the large multidimensional data sets encountered during the early stages of the drug discovery process. The results reported clearly show that the GTM and HGTM algorithms allow the user to cluster active compounds for different targets and understand them better than the benchmarks. An interactive software tool supporting these visualization algorithms was provided to the domain experts. The tool supports the domain experts in exploring the projections obtained from the visualization algorithms, providing facilities such as parallel coordinate plots, magnification factors, directional curvatures, and integration with industry-standard software. © 2006 American Chemical Society.
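GTM and HGTM are not available in mainstream Python libraries, so the sketch below reproduces only the PCA benchmark step on a synthetic stand-in for the descriptor matrix, to show the kind of 2-D projection the principled methods are being compared against; the array shape and column meanings are invented for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in for the compound data set: rows are compounds,
# columns are biological activities plus whole-molecule physicochemical
# properties (shapes and values invented for illustration).
rng = np.random.default_rng(seed=0)
X = rng.normal(size=(1000, 15))

# Standardise the descriptors so no single property dominates the variance.
X = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA benchmark: project every compound onto the two directions of
# maximum variance, giving a 2-D map analogous to the GTM latent space.
pca = PCA(n_components=2)
latent = pca.fit_transform(X)          # shape (1000, 2), ready to plot
print(pca.explained_variance_ratio_)   # variance captured by the 2-D map
```

GTM differs from this linear projection in that it fits a constrained mixture of Gaussians over a latent grid by EM, so each compound receives a posterior distribution across the map rather than a single point given by a linear transform.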
Abstract:
This paper proposes a novel framework for incorporating protein-protein interaction (PPI) ontology knowledge into PPI extraction from biomedical literature in order to address the emerging challenges of deep natural language understanding. It is built upon existing work on relation extraction using the Hidden Vector State (HVS) model. The HVS model belongs to the category of statistical learning methods. It can be trained directly from unannotated data in a constrained way whilst at the same time being able to capture the underlying named entity relationships. However, it is difficult to incorporate background knowledge or non-local information into the HVS model. This paper proposes to represent the HVS model as a conditionally trained undirected graphical model in which non-local features derived from the PPI ontology through inference can be easily incorporated. The seamless fusion of ontology inference with statistical learning produces a new paradigm for information extraction.
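The abstract does not specify the model's feature set, so the fragment below only sketches the general idea it describes: augmenting local token features with a non-local feature derived from a PPI ontology lookup over the whole sentence, before handing the features to a discriminative sequence learner. The feature names, the toy ontology and the example sentence are all hypothetical.

```python
# Hypothetical illustration only: local features for one token plus a
# non-local feature obtained by looking the sentence's protein mentions
# up in a (toy) PPI ontology. A real system would pass these feature
# dicts to a conditionally trained sequence model.
TOY_PPI_ONTOLOGY = {("raf", "mek"), ("mek", "erk")}   # invented pairs


def sentence_has_known_interaction(proteins):
    """True if any protein pair in the sentence is already in the ontology."""
    pairs = {(a.lower(), b.lower()) for a in proteins for b in proteins if a != b}
    return any(pair in TOY_PPI_ONTOLOGY for pair in pairs)


def token_features(tokens, i, proteins_in_sentence):
    word = tokens[i]
    return {
        "word.lower": word.lower(),        # local lexical feature
        "word.is_title": word.istitle(),   # local orthographic feature
        # Non-local, ontology-derived feature shared by every token
        # in the sentence:
        "ontology.known_pair": sentence_has_known_interaction(proteins_in_sentence),
    }


tokens = ["Raf", "phosphorylates", "MEK", "."]
print(token_features(tokens, 1, proteins_in_sentence=["Raf", "MEK"]))
```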
Abstract:
To capture genomic profiles of histone modification, chromatin immunoprecipitation (ChIP) is combined with next-generation sequencing, an approach called ChIP-seq. However, the enriched regions generated from ChIP-seq data are typically evaluated only against the limited knowledge acquired from manually examining the relevant biological literature. This paper proposes a novel framework which integrates multiple knowledge sources such as biological literature, Gene Ontology, and microarray data. To analyze ChIP-seq data for histone modification precisely, knowledge integration is based on a unified probabilistic model. The model is employed to re-rank the enriched regions generated by peak finding algorithms. By filtering the re-ranked enriched regions with a predefined threshold, more reliable and precise results can be generated. The combination of the multiple knowledge sources with the peak finding algorithms produces a new paradigm for ChIP-seq data analysis. © (2012) Trans Tech Publications, Switzerland.
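The unified probabilistic model is not given in the abstract; one minimal way to picture the re-ranking and filtering it describes is a weighted combination of the peak caller's score with evidence scores from the literature, Gene Ontology and microarray sources, followed by a threshold. The weights, field names and example regions below are invented for illustration and do not come from the paper.

```python
# Illustrative re-ranking of enriched regions by combining the peak
# caller's score with literature, Gene Ontology and microarray evidence.
# Weights, field names and example records are invented for this sketch.
WEIGHTS = {"peak": 0.4, "literature": 0.2, "go": 0.2, "microarray": 0.2}


def rerank(regions, threshold=0.5):
    """Sort regions by combined evidence and drop those below the threshold."""
    scored = [(sum(WEIGHTS[k] * r[k] for k in WEIGHTS), r["name"]) for r in regions]
    scored.sort(reverse=True)
    return [(score, name) for score, name in scored if score >= threshold]


regions = [
    {"name": "chr1:100-600", "peak": 0.9, "literature": 0.8, "go": 0.7, "microarray": 0.6},
    {"name": "chr2:400-900", "peak": 0.7, "literature": 0.1, "go": 0.2, "microarray": 0.3},
]
print(rerank(regions))   # only chr1:100-600 survives the 0.5 threshold
```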
Abstract:
New media technologies, the digitisation of information, learning archives and heritage resources are changing the nature of public library and museum services across the globe, and, in so doing, changing the way present and future users of these services interact with these institutions in real and virtual spaces. New digital technologies are rewriting the nature of participation, learning and engagement with the public library, and fashioning a new paradigm in which virtual and physical spaces and educative and temporal environments operate symbiotically. It is with such a creatively disruptive paradigm that the £193 million Library of Birmingham project in the United Kingdom is being developed. New and old media forms and platforms are helping to fashion new public places and spaces that reaffirm the importance of public libraries as conceived in the nineteenth century. As people's universities, the public library service offers a web of connective learning opportunities and affordances. This article considers the importance of community libraries as sites of intercultural understanding and practical social democracy. Their significance is reaffirmed through the initial findings from the first in a series of community interventions forming part of a long-term project, 'Connecting Spaces and Places', funded by the Royal Society of Arts.
Abstract:
Recent technological advances have paved the way for developing and offering advanced services for stakeholders in the agricultural sector. A paradigm shift is underway from proprietary, monolithic tools to Internet-based, cloud-hosted, open systems that will enable more effective collaboration between stakeholders. This new paradigm includes technological support for application developers to create specialized services that seamlessly interoperate, thus creating a sophisticated and customisable working environment for the end users. We present the implementation of an open architecture that instantiates such an approach, based on a set of domain-independent software tools called "generic enablers" that have been developed in the context of the FI-WARE project. The implementation is used to validate a number of innovative concepts for the agricultural sector, such as the notion of a services marketplace and the system's adaptation to network failures. During the design and implementation phase, the system was evaluated by end users, offering us valuable feedback. The results of the evaluation process validate the acceptance of such a system and the need for farmers to have access to sophisticated services at affordable prices. A summary of this evaluation process is also presented in this paper. © 2013 Elsevier B.V.
Abstract:
This paper focuses upon the argument that the role played by the engineering profession within today's society has changed markedly over the past several years, from providing the foundations for contemporary life to leading societal change and becoming one of the key drivers of future social development. Coining the term 'Engineering-Sociology', this paper contributes to engineering education and engineering education research by proposing a new paradigm upon which future engineering education programmes and engineering education research might build. Developed out of an approach to learning and teaching practice, Engineering-Sociology encapsulates both traditional and applied approaches to engineering education and engineering education research. It suggests that in order to meet future challenges there is a need to bring together what are generally perceived to be two diametrically opposed paradigms, namely engineering and sociology. Building on contemporary theoretical and pedagogical arguments in engineering education research, the paper concludes that by encouraging engineering educators to 'think differently', Engineering-Sociology can provide an approach to learning and teaching that both enhances the student experience and meets the changing needs of society.
Abstract:
Sex and the City has been the subject of close scrutiny within feminist scholarship in terms of whether it is considered to be a reactionary or progressive text. While this debate is valuable within a modernist feminist paradigm, it makes less sense from a post-modernist feminist perspective. Alternately using semiotic and feminist post-structuralist methods of textual analysis, this paper shows that Sex and the City can be viewed as reactionary according to a modernist reading, but is altogether more challenging and complex according to a post-modernist reading.
Abstract:
This thesis presents a study of the sources of new product ideas and the development of new product proposals in an organisation in the UK Computer Industry. The thesis extends the work of von Hippel by showing how the phenomenon which he describes as "the Customer Active Paradigm for new product idea generation" can be observed to operate in this Industry. Furthermore, this thesis contrasts his Customer Active Paradigm with the more usually encountered Manufacturer Active Paradigm. In a second area, the thesis draws a number of conclusions relating to methods of market research, confirming existing observations and demonstrating the suitability of flexible interview strategies in certain circumstances. The thesis goes on to demonstrate the importance of free information flow within the organisation, making it more likely that sought and unsought opportunities can be exploited. It is shown that formal information flows and documents are a necessary but not sufficient means of influencing the formation of the organisation's dominant ideas on new product areas. The findings also link the work of Tushman and Katz on the role of "Gatekeepers" with the work of von Hippel by showing that the role of gatekeeper is particularly appropriate and useful to an organisation changing from Customer Active to Manufacturer Active methods of idea generation. Finally, the thesis provides conclusions relating to the exploitation of specific new product opportunities facing the sponsoring organisation.
Abstract:
This paper looks at the way in which, over recent years, paradigms for manufacturing management have evolved as a result of changing economic and environmental circumstances. The lean production concept, devised during the 1980s, proved robust only until the end of the bubble economy in Japan caused firms to re-examine the underlying principles of the lean production paradigm and redesign their production systems to suit the changing circumstances they were facing. Since that time a plethora of new concepts have emerged, most of which have been based on improving the way that firms are able to respond to the uncertainties of the new environment in which they have found themselves operating. The main question today is whether firms should be agile or adaptable. Both concepts imply a measure of responsiveness, but recent changes in the nature of the uncertainties have heightened the debate about what strategies should be adopted in the future.
Abstract:
The enterprise management (EM) approach provides a holistic view of organizations and their related information systems. In order to align information technology (IT) innovation with global markets and volatile virtualization, traditional firms are seeking to reconstruct their enterprise structures alongside repositioning strategy, and to establish new information system (IS) architectures to transform from single autonomous entities into more open enterprises supported by new Enterprise Resource Planning (ERP) systems. This chapter shows how ERP engage-abilities cater to three distinctive EM patterns and the resultant strategies. The purpose is to examine the presumptions and importance of combining ERP and inter-firm relations, relying on the virtual value chain concept. Following a review of the literature on ERP development and enterprise strategy, exploratory inductive research studies in Zoomlion and Lanye have been conducted. In addition, the authors propose a dynamic conceptual framework to demonstrate the adoption and governance of ERP in the three enterprise management forms and point to a new architectural type (ERPIII) for operating in the virtual enterprise paradigm. © 2012, IGI Global.
Abstract:
The enterprise management (EM) approach provides a holistic view of organizations and their related information systems. In order to align information technology (IT) innovation with global markets and volatile virtualization, traditional firms are seeking to reconstruct their enterprise structures alongside repositioning strategy, and to establish new information system (IS) architectures to transform from single autonomous entities into more open enterprises supported by new Enterprise Resource Planning (ERP) systems. This chapter shows how ERP engage-abilities cater to three distinctive EM patterns and the resultant strategies. The purpose is to examine the presumptions and importance of combining ERP and inter-firm relations, relying on the virtual value chain concept. Following a review of the literature on ERP development and enterprise strategy, exploratory inductive research studies in Zoomlion and Lanye have been conducted. In addition, the authors propose a dynamic conceptual framework to demonstrate the adoption and governance of ERP in the three enterprise management forms and point to a new architectural type (ERPIII) for operating in the virtual enterprise paradigm.