827 results for SUPPLY AND INFORMATION NETWORKS
Abstract:
Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.
Abstract:
The supply chain management (SCM) concept has become embedded in the thinking of many organisations in recent years. Originally introduced by management consultants in the early 1980s, SCM has a strong focus on integration of processes across functions within firms, as well as between the organisations that comprise the wider extended enterprise. There is a significant body of research to support the notion that the consistent delivery of value to customers is predicated on higher levels of intra-firm and inter-firm integration. Putting the supply chain integration (SCI) concept into practice is critically dependent on the ability of firms to manage material, money and information flows in a holistic manner. It also depends on the way in which relationships between key supply chain actors are managed. This article explores the “mega-trends” that are evident across most sectors and which have a potentially significant impact on the ability of organisations to put SCM theory into practice. The late Don Bowersox and his colleagues from Michigan State University introduced the idea of supply chain “mega-trends” over a decade ago in their widely cited article in the Journal of Business Logistics (Bowersox et al., 2000). This article explores the current status of these “mega-trends” in an Irish context based on research being undertaken at the National Institute for Transport and Logistics (NITL). It also identifies some key factors that are likely to impact upon progress in these key areas in the medium term.
Abstract:
This paper discusses demand and supply chain management and examines how artificial intelligence (AI) techniques and Radio Frequency Identification (RFID) technology can enhance the responsiveness of the logistics workflow. The proposed system is expected to have a significant impact on the performance of logistics networks by virtue of its ability to adapt to unexpected supply and demand changes in a volatile marketplace, with responsiveness enabled by RFID. Recent studies have found that RFID and artificial intelligence techniques drive the development of total solutions in the logistics industry. Apart from tracking the movement of goods, RFID can play an important role in reflecting the inventory levels of various distribution areas. In today's globalized industrial environment, physical logistics operations and the associated flow of information are the essential elements for companies to realize an efficient logistics workflow. Fundamentally, a flexible logistics workflow, characterized by fast responsiveness in dealing with customer requirements through the integration of various value chain activities, is essential to leverage the business performance of enterprises. The significance of this research lies in demonstrating the synergy of combining advanced technologies into an integrated system that helps achieve a lean and agile logistics workflow.
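As a rough illustration of the kind of RFID-driven inventory visibility the abstract describes (a minimal sketch, not the paper's actual system), the snippet below aggregates simulated tag-read events into per-location inventory counts. The event fields and the `reorder_threshold` parameter are illustrative assumptions.

```python
from collections import defaultdict

# Each RFID read event is (tag_id, location, action), where action is "in"
# when a tagged item enters a distribution area and "out" when it leaves.
# These fields are illustrative assumptions, not the paper's data model.
def update_inventory(events, reorder_threshold=10):
    """Aggregate RFID tag reads into per-location inventory levels and
    flag locations whose stock falls below a reorder threshold."""
    inventory = defaultdict(set)   # location -> set of tag_ids currently present
    for tag_id, location, action in events:
        if action == "in":
            inventory[location].add(tag_id)
        elif action == "out":
            inventory[location].discard(tag_id)

    levels = {loc: len(tags) for loc, tags in inventory.items()}
    low_stock = [loc for loc, qty in levels.items() if qty < reorder_threshold]
    return levels, low_stock


if __name__ == "__main__":
    sample_events = [
        ("TAG001", "warehouse-A", "in"),
        ("TAG002", "warehouse-A", "in"),
        ("TAG001", "warehouse-A", "out"),   # item shipped out of warehouse-A
        ("TAG003", "hub-B", "in"),
    ]
    levels, low_stock = update_inventory(sample_events, reorder_threshold=2)
    print(levels)      # {'warehouse-A': 1, 'hub-B': 1}
    print(low_stock)   # both locations below the illustrative threshold
```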
Abstract:
Business angels are natural persons who provide equity financing for young enterprises and gain ownership in them. They are usually anonymous investors who operate in the background of the companies. An important feature is that, beyond funding the enterprises, they can contribute to the success of the companies with their special expertise and strategic support, drawing on their business experience. As a result of the asymmetric information between the angels and the companies, their matching is difficult (Becsky-Nagy – Fazekas 2015), and the fact that angel investors prefer anonymity makes it harder for entrepreneurs to obtain informal venture capital. The primary aim of the different types of business angel organizations and networks is to ease this matching process by intermediating between the two parties. The role of these organizations in the informal venture capital market is increasing compared to that of individually operating angels. The recognition of their economic importance has led many governments to support them. There have also been public initiatives aimed at establishing these intermediary organizations, which led to the institutionalization of business angels. Through the characterization of business angels, this study focuses on the progress of these informational intermediaries and their paths of development, with regard to international trends and the current situation of Hungarian business angels and angel networks.
Abstract:
The Internet has become an integral part of our nation's critical socio-economic infrastructure. With its heightened use and growing complexity, however, organizations are at greater risk of cyber crimes. To aid in the investigation of crimes committed on or via the Internet, a network forensics analysis tool pulls together the needed digital evidence. It provides a platform for performing deep network analysis by capturing, recording and analyzing network events to find the source of a security attack or other information security incidents. Existing network forensics work has mostly focused on the Internet and fixed networks, but the exponential growth and use of wireless technologies, coupled with their unprecedented characteristics, necessitates the development of new network forensic analysis tools. This dissertation fostered the emergence of a new research field in cellular and ad-hoc network forensics. It was one of the first works to identify this problem and offer fundamental techniques and tools that laid the groundwork for future research. In particular, it introduced novel methods to record network incidents and to report logged incidents. For recording, location is considered essential to documenting network incidents; however, in network topology spaces, location cannot be measured due to the absence of a 'distance metric'. A novel solution was therefore proposed to label the locations of nodes within network topology spaces and then to authenticate the identity of nodes in ad hoc environments. For reporting logged incidents, a novel technique based on Distributed Hash Tables (DHTs) was adopted. Although the direct use of DHTs for reporting logged incidents would result in uncontrollable recursive traffic, a new mechanism was introduced that overcomes this recursive process. These logging and reporting techniques aided forensics over cellular and ad-hoc networks, which in turn increased the ability to track and trace attacks to their source. These techniques are a starting point for further research and development towards equipping future ad hoc networks with forensic components that complement existing security mechanisms.
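To give a flavour of DHT-style storage of logged incidents (a minimal sketch under simplifying assumptions; the dissertation's actual reporting and anti-recursion mechanism is not modelled here), the snippet below hashes an incident record to a key and routes it to the responsible node on a simplified consistent-hashing ring. The node names and record fields are invented for illustration.

```python
import bisect
import hashlib
import json

def key_for(record: dict) -> int:
    """Map an incident record to a 160-bit key, as a DHT would."""
    digest = hashlib.sha1(json.dumps(record, sort_keys=True).encode()).hexdigest()
    return int(digest, 16)

class SimpleDHT:
    """Minimal consistent-hashing ring: each key is stored on the first
    node whose ID is >= the key, wrapping around past the largest ID."""

    def __init__(self, node_ids):
        self.ring = sorted(node_ids)
        self.storage = {node: {} for node in self.ring}

    def responsible_node(self, key: int) -> int:
        idx = bisect.bisect_left(self.ring, key)
        return self.ring[idx % len(self.ring)]

    def report_incident(self, record: dict) -> int:
        """Store a logged incident on the node responsible for its key."""
        key = key_for(record)
        node = self.responsible_node(key)
        self.storage[node][key] = record
        return node


if __name__ == "__main__":
    # Node IDs drawn from the same keyspace by hashing illustrative names.
    nodes = [int(hashlib.sha1(name.encode()).hexdigest(), 16)
             for name in ("node-a", "node-b", "node-c")]
    dht = SimpleDHT(nodes)
    incident = {"source": "10.0.0.7", "type": "flooding", "timestamp": 1700000000}
    print(hex(dht.report_incident(incident))[:12])  # prefix of the responsible node ID
```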
Abstract:
With the advent of peer-to-peer networks, and more importantly sensor networks, the desire to extract useful information from continuous and unbounded streams of data has become more prominent. For example, in tele-health applications, sensor-based data streaming systems are used to continuously and accurately monitor Alzheimer's patients and their surrounding environment. Typically, the requirements of such applications necessitate the cleaning and filtering of continuous, corrupted and incomplete data streams gathered wirelessly in dynamically varying conditions. Yet, existing data stream cleaning and filtering schemes are incapable of capturing the dynamics of the environment while simultaneously suppressing the losses and corruption introduced by uncertain environmental, hardware, and network conditions. Consequently, existing data cleaning and filtering paradigms are being challenged. This dissertation develops novel schemes for cleaning data streams received from a wireless sensor network operating under non-linear and dynamically varying conditions. The study establishes a paradigm for validating spatio-temporal associations among data sources to enhance data cleaning. To reduce the complexity of the validation process, the developed solution maps the requirements of the application onto a geometrical space and identifies the potential sensor nodes of interest. Additionally, this dissertation models a wireless sensor network data reduction system, showing that segregating the data adaptation and prediction processes augments the data reduction rates. The schemes presented in this study are evaluated using simulation and information theory concepts. The results demonstrate that the dynamic conditions of the environment are better managed when validation is used for data cleaning. They also show that when a fast-convergent adaptation process is deployed, data reduction rates are significantly improved. Targeted applications of the developed methodology include machine health monitoring, tele-health, environment and habitat monitoring, intermodal transportation and homeland security.
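One common way to realize prediction-driven data reduction of the kind described above is a dual-prediction scheme in which the sensor transmits a reading only when it deviates from what the sink would reconstruct by more than a tolerance. The sketch below uses a simple last-reported-value predictor and is an illustrative assumption, not the dissertation's actual scheme.

```python
def reduce_stream(readings, tolerance=0.5):
    """Prediction-based data reduction with a last-reported-value predictor:
    the sensor transmits a reading only when it deviates from the last
    transmitted value by more than `tolerance`."""
    transmitted = []            # (index, value) pairs actually sent over the radio
    last_sent = None
    for i, value in enumerate(readings):
        if last_sent is None or abs(value - last_sent) > tolerance:
            transmitted.append((i, value))
            last_sent = value
    return transmitted


def reconstruct(transmitted, length):
    """Sink-side reconstruction: hold the last received value until the next
    transmission arrives; the reconstruction error stays within the tolerance."""
    sent = dict(transmitted)
    values, current = [], None
    for i in range(length):
        current = sent.get(i, current)
        values.append(current)
    return values


if __name__ == "__main__":
    stream = [20.0, 20.1, 20.2, 23.5, 23.6, 23.4, 30.0]   # invented readings
    sent = reduce_stream(stream, tolerance=1.0)
    print(f"transmitted {len(sent)} of {len(stream)} readings:", sent)
    print(reconstruct(sent, len(stream)))
```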
Abstract:
In outsourcing relationships with China, the Electronic Manufacturing (EM) and Information Technology Services (ITS) industry in Taiwan possesses such advantages as the continuing growth of its production value, a complete manufacturing supply chain, low production costs, a large-scale Chinese market, and language and cultural similarity compared to outsourcing to other countries. Nevertheless, the Council for Economic Planning and Development of the Executive Yuan (CEPD) found that Taiwan's IT services outsourcing to China is subject to certain constraints and might not be as successful as EM outsourcing (Aggarwal, 2003; CEPD, 2004a; CIER, 2003; Einhorn and Kriplani, 2003; Kumar and Zhu, 2006; Li and Gao, 2003; MIC, 2006). Some studies have examined this issue, but failed to (1) provide statistical evidence about the lower prevalence rates of IT services outsourcing, and (2) clearly explain those lower prevalence rates by identifying similarities and differences between the two outsourcing contexts. This research seeks to fill that gap and to provide potential strategic guidelines for ITS firms in Taiwan. The study adopts Transaction Cost Economics (TCE) as its theoretical basis. The basic premise is that different types of outsourcing activities may incur differing transaction costs and realize varying degrees of outsourcing success due to differential attributes of the transactions in the outsourcing process. Using primary data gathered from questionnaire surveys of ninety-two firms, the results from exploratory analysis and binary logistic regression indicated that: (1) when outsourcing to China, Taiwanese firms' ITS outsourcing tends to involve higher levels of asset specificity, uncertainty and technical skills relative to EM outsourcing, and these features indirectly reduce firms' outsourcing prevalence rates via their direct positive impacts on transaction costs; (2) Taiwanese firms' ITS outsourcing tends to involve a lower level of transaction structurability relative to EM outsourcing, and this feature indirectly increases firms' outsourcing prevalence rates via its direct negative impact on transaction costs; (3) frequency influences firms' transaction costs in ITS outsourcing positively, but does not affect their outsourcing prevalence rates; (4) relatedness influences firms' transaction costs positively and prevalence rates negatively in ITS outsourcing, but its impact on prevalence rates is not mediated by transaction costs; and (5) the firm size of the outsourcing provider does not affect firms' transaction costs, but does directly and positively affect their outsourcing prevalence rates in ITS outsourcing. Using primary data gathered from face-to-face interviews with executives from seven firms, the results from inductive analysis indicated that (1) IT services outsourcing has lower prevalence rates than EM outsourcing, and (2) this result is mainly attributed to Taiwan's core competence in manufacturing and management and to the higher overall transaction costs of IT services outsourcing. Specifically, there is not much difference between the two outsourcing contexts in the transaction characteristic of reputation and in most aspects of the overall comparison. Although there are some differences in the firm size of the outsourcing provider, this difference does not have an apparent impact on firms' overall transaction costs.
The medium or above-medium differences in the transaction characteristics of asset specificity, uncertainty, frequency, technical skills, transaction structurability, and relatedness have caused higher overall transaction costs for IT services outsourcing. This higher cost may explain the lower prevalence rates of ITS outsourcing relative to EM outsourcing. Overall, the interview results are consistent with the statistical analyses and support the expectation that, in outsourcing to China, Taiwan's electronic manufacturing firms have lower prevalence rates of IT services outsourcing relative to EM outsourcing due to higher transaction costs caused by certain transaction attributes. To address this, firms' management should identify alternative strategies and strive to reduce the overall transaction costs of IT services outsourcing by initiating appropriate strategies that fit their environment and needs.
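The binary logistic regression analysis mentioned above can be illustrated with a minimal sketch in which hypothetical transaction attributes (asset specificity, uncertainty, technical skills, structurability) predict whether a firm outsources a given activity. The variable names, the synthetic data and the use of scikit-learn are assumptions for illustration, not the study's actual model or data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic survey data: one row per firm, columns are transaction attributes
# scored on Likert-style scales (purely illustrative values, not the study's data).
rng = np.random.default_rng(0)
X = rng.integers(1, 8, size=(92, 4)).astype(float)   # asset specificity, uncertainty,
                                                      # technical skills, structurability

# Illustrative outcome: 1 = firm outsources the activity, 0 = it does not,
# generated so that higher scores on the first three attributes lower the odds.
logits = 3.0 - 0.4 * X[:, 0] - 0.3 * X[:, 1] - 0.3 * X[:, 2] + 0.5 * X[:, 3]
y = (rng.random(92) < 1 / (1 + np.exp(-logits))).astype(int)

model = LogisticRegression().fit(X, y)
print("coefficients:", model.coef_.round(2))       # sign indicates direction of effect
print("odds ratios:", np.exp(model.coef_).round(2))
```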
Abstract:
Water systems in the Sultanate of Oman are inevitably exposed to varied threats and hazards from both natural and man-made causes. Natural disasters, especially tropical cyclone Gonu in 2007, caused immense damage to water supply systems in Oman. At the same time, water loss from leaks is a major operational problem. This research developed an integrated approach to identify and rank the risks to the water sources, transmission pipelines and distribution networks in Oman and to suggest appropriate mitigation measures. The system resilience was evaluated and an emergency response plan for the water supplies developed. The methodology involved mining the data held by the water supply utility for risk and resilience determination, and operational data to support calculations of non-revenue water (NRW). Risk factors were identified, ranked and scored at a stakeholder workshop, and the operational information required was principally gathered from interviews. Finally, an emergency response plan was developed by evaluating the risk and resilience factors. The risk analysis and assessment used a Coarse Risk Analysis (CRA) approach, and risk scores were generated using a simple risk matrix based on WHO recommendations. The likelihoods and consequences of a wide range of hazardous events were identified through a key workshop and subsequent questionnaires. The thesis proposes a method of translating the detailed risk evaluations into resilience scores through a methodology used in transportation networks. A water audit indicated that the percentage of NRW in Oman is greater than 35%, which is similar to other Gulf countries but high by international standards. The principal strategy for managing NRW used in the research was the AWWA water audit method, which includes free-to-use software and was found to be easy to apply in Oman. The research showed that risks to the main desalination processes can be controlled, but the risk due to feed water quality might remain high even after implementing mitigation measures because the intake is close to an oil port with a significant risk of oil contamination and algal blooms. The most severe risks to transmission mains were found to be associated with pipe rather than pump failure. The systems in Oman were found to be moderately resilient: the resilience of the desalination plants is reasonably high, but the transmission mains and pumping stations are very vulnerable. The integrated strategy developed in this study has wide applicability, particularly in the Gulf area, which may face risks from exceptional events and will be experiencing NRW. Other developing countries may also experience such risks, but with different magnitudes, and the risk evaluation tables could provide a useful format for further work.
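As a simple illustration of the non-revenue water calculation that underlies an AWWA-style water audit (a minimal sketch of the standard IWA/AWWA water balance, not the utility's actual audit), the snippet below computes the NRW volume and percentage from a system input volume and billed authorized consumption. The example figures are invented.

```python
def non_revenue_water(system_input_volume, billed_authorized_consumption):
    """Return the NRW volume and NRW as a percentage of system input.

    In the IWA/AWWA water balance, non-revenue water is the difference between
    the volume put into the system and the billed authorized consumption
    (revenue water); it comprises unbilled authorized consumption plus
    apparent and real losses.
    """
    nrw = system_input_volume - billed_authorized_consumption
    return nrw, 100.0 * nrw / system_input_volume


if __name__ == "__main__":
    # Invented example figures (million m3/year) for illustration only.
    nrw_volume, nrw_percent = non_revenue_water(100.0, 63.0)
    print(f"NRW = {nrw_volume:.1f} million m3/year ({nrw_percent:.0f}% of system input)")
    # A result above 35% would be consistent with the level reported for Oman.
```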
Abstract:
The overwhelming amount and unprecedented speed of publication in the biomedical domain make it difficult for life science researchers to acquire and maintain a broad view of the field and to gather all information that would be relevant for their research. As a response to this problem, the BioNLP (Biomedical Natural Language Processing) community of researchers has emerged and strives to assist life science researchers by developing modern natural language processing (NLP), information extraction (IE) and information retrieval (IR) methods that can be applied at large scale, to scan the whole publicly available biomedical literature and extract and aggregate the information found within, while automatically normalizing the variability of natural language statements. Among different tasks, biomedical event extraction has recently received much attention within the BioNLP community. Biomedical event extraction constitutes the identification of biological processes and interactions described in biomedical literature, and their representation as a set of recursive event structures. The 2009–2013 series of BioNLP Shared Tasks on Event Extraction has given rise to a number of event extraction systems, several of which have been applied at a large scale (the full set of PubMed abstracts and PubMed Central Open Access full-text articles), leading to the creation of massive biomedical event databases, each containing millions of events. Since top-ranking event extraction systems are based on machine-learning approaches and are trained on the narrow-domain, carefully selected Shared Task training data, their performance drops when faced with the topically highly varied PubMed and PubMed Central documents. Specifically, false-positive predictions by these systems lead to the generation of incorrect biomolecular events, which are spotted by the end-users. This thesis proposes a novel post-processing approach, utilizing a combination of supervised and unsupervised learning techniques, that can automatically identify and filter out a considerable proportion of incorrect events from large-scale event databases, thus increasing the general credibility of those databases. The second part of this thesis is dedicated to a system we developed for hypothesis generation from large-scale event databases, which is able to discover novel biomolecular interactions among genes and gene products. We cast the hypothesis generation problem as supervised network topology prediction, i.e. predicting new edges in the network, as well as the types and directions of these edges, utilizing a set of features that can be extracted from large biomedical event networks. Routine machine learning evaluation results, as well as manual evaluation results, suggest that the problem is indeed learnable. This work won the Best Paper Award at The 5th International Symposium on Languages in Biology and Medicine (LBM 2013).
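To make the network-topology-prediction framing concrete, the sketch below trains a classifier to predict whether an edge exists between two genes/gene-products from simple graph-derived features (shared neighbours, degrees). The toy graph, the features and the use of networkx/scikit-learn are illustrative assumptions, not the system described in the thesis.

```python
import itertools
import networkx as nx
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Toy event network: nodes are genes/gene-products, edges are observed
# interactions (purely invented for illustration).
g = nx.Graph()
g.add_edges_from([("TP53", "MDM2"), ("MDM2", "AKT1"), ("AKT1", "MTOR"),
                  ("MTOR", "RPS6"), ("TP53", "ATM"), ("ATM", "CHEK2")])

def pair_features(graph, u, v):
    """Simple topological features for a candidate edge (u, v)."""
    common = len(list(nx.common_neighbors(graph, u, v)))
    return [common, graph.degree(u), graph.degree(v)]

# Positives: existing edges; negatives: remaining node pairs. In a real
# evaluation the known edges would be held out when computing features.
pairs = [(u, v, 1) for u, v in g.edges()]
pairs += [(u, v, 0) for u, v in itertools.combinations(g.nodes(), 2)
          if not g.has_edge(u, v)]

X = np.array([pair_features(g, u, v) for u, v, _ in pairs])
y = np.array([label for _, _, label in pairs])

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
# Score an unseen candidate interaction (hypothesis generation).
print(clf.predict_proba([pair_features(g, "TP53", "CHEK2")])[0, 1])
```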
Abstract:
The paper presents a critical analysis of the extant literature pertaining to the networking behaviours of young jobseekers in both offline and online environments. A framework derived from information behaviour theory is proposed as a basis for conducting further research in this area. Method. Relevant material for the review was sourced from key research domains such as library and information science, job search research, and organisational research. Analysis. Three key research themes emerged from the analysis of the literature: (1) social networks, and the use of informal channels of information during job search, (2) the role of networking behaviours in job search, and (3) the adoption of social media tools. Tom Wilson’s general model of information behaviour was also identified as a suitable framework to conduct further research. Results. Social networks have a crucial informational utility during the job search process. However, the processes whereby young jobseekers engage in networking behaviours, both offline and online, remain largely unexplored. Conclusion. Identification and analysis of the key research themes reveal opportunities to acquire further knowledge regarding the networking behaviours of young jobseekers. Wilson’s model can be used as a framework to provide a holistic understanding of the networking process, from an information behaviour perspective.
Abstract:
Single-cell functional proteomics assays can connect genomic information to biological function through quantitative and multiplex protein measurements. Tools for single-cell proteomics have developed rapidly over the past five years and are providing unique opportunities. This thesis describes an emerging microfluidics-based toolkit for single-cell functional proteomics, focusing on the development of single-cell barcode chips (SCBCs) with applications in fundamental and translational cancer research.
The thesis begins with a microchip designed to simultaneously quantify a panel of secreted, cytoplasmic and membrane proteins from single cells; this chip is the prototype for subsequent proteomic microchips with more sophisticated designs used in preclinical cancer research or clinical applications. The SCBCs are a highly versatile and information-rich tool for single-cell functional proteomics. They are based upon isolating individual cells, or defined numbers of cells, within microchambers, each of which is equipped with a large antibody microarray (the barcode), with between a few hundred and ten thousand microchambers included within a single microchip. Functional proteomics assays at single-cell resolution yield unique pieces of information that significantly shape thinking in cancer research. An in-depth discussion of the analysis and interpretation of this unique information, such as functional protein fluctuations and protein-protein correlative interactions, follows.
The SCBC is a powerful tool for resolving the functional heterogeneity of cancer cells. It has the capacity to extract a comprehensive picture of the signal transduction network from single tumor cells and thus provides insight into the effect of targeted therapies on protein signaling networks. We demonstrate this point by applying the SCBCs to investigate three isogenic cell lines of glioblastoma multiforme (GBM).
The cancer cell population is highly heterogeneous, with high-amplitude fluctuations at the single-cell level, which in turn underpin the robustness of the entire population. The concept of a stable population existing in the presence of random fluctuations is reminiscent of many physical systems that are successfully understood using statistical physics. Thus, tools derived from that field can probably be applied to using fluctuations to determine the nature of signaling networks. In the second part of the thesis, we focus on such a case, using thermodynamics-motivated principles to understand cancer cell hypoxia, where single-cell proteomics assays coupled with a quantitative version of Le Chatelier's principle derived from statistical mechanics yield detailed and surprising predictions, which were found to be correct in both cell-line and primary tumor models.
The third part of the thesis demonstrates the application of this technology in preclinical cancer research to study the resistance of GBM cancer cells to molecular targeted therapy. Physical approaches to anticipate therapy resistance and to identify effective therapy combinations are discussed in detail. Our approach is based upon elucidating the signaling coordination within the phosphoprotein signaling pathways that are hyperactivated in human GBMs, and interrogating how that coordination responds to perturbation by targeted inhibitors. Most signaling cascades consist of strongly coupled protein-protein interactions. A physical analogy of such a system is the strongly coupled atom-atom interactions in a crystal lattice. Just as atomic interactions can be decomposed into a series of independent normal vibrational modes, a simplified picture of signaling network coordination can be achieved by diagonalizing protein-protein correlation or covariance matrices, decomposing the pairwise correlative interactions into a set of distinct linear combinations of signaling proteins (i.e. independent signaling modes). By doing so, two independent signaling modes, one associated with mTOR signaling and a second associated with ERK/Src signaling, were resolved, which in turn allowed us to anticipate resistance and to design effective combination therapies, as well as to identify those therapies and therapy combinations that will be ineffective. We validated our predictions in mouse tumor models and all predictions were borne out.
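The mode decomposition described above can be illustrated with a minimal sketch: diagonalizing a protein-protein covariance matrix built from single-cell measurements yields eigenvectors that act as independent linear combinations of proteins (signaling modes). The protein names and the synthetic data are assumptions for illustration, not the thesis's measurements.

```python
import numpy as np

# Synthetic single-cell measurements (cells x proteins); protein names are
# illustrative and the values contain two built-in correlated groups.
rng = np.random.default_rng(1)
proteins = ["p-mTOR", "p-S6K", "p-ERK", "p-Src"]
mtor_mode = rng.normal(size=(500, 1))
erk_mode = rng.normal(size=(500, 1))
data = np.hstack([
    mtor_mode + 0.2 * rng.normal(size=(500, 1)),   # p-mTOR
    mtor_mode + 0.2 * rng.normal(size=(500, 1)),   # p-S6K
    erk_mode + 0.2 * rng.normal(size=(500, 1)),    # p-ERK
    erk_mode + 0.2 * rng.normal(size=(500, 1)),    # p-Src
])

# Diagonalize the covariance matrix: eigenvectors are independent linear
# combinations of proteins (signaling modes), eigenvalues their variance.
cov = np.cov(data, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(cov)
order = np.argsort(eigenvalues)[::-1]
for idx in order[:2]:
    loading = ", ".join(f"{p}: {w:+.2f}" for p, w in zip(proteins, eigenvectors[:, idx]))
    print(f"mode variance {eigenvalues[idx]:.2f} -> {loading}")
```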
In the last part, some preliminary results concerning the clinical translation of single-cell proteomics chips are presented. The successful demonstration of our work on human-derived xenografts provides the rationale to extend our current work into the clinic. It will enable us to interrogate GBM tumor samples in a way that could potentially yield a straightforward, rapid interpretation, so that we can give therapeutic guidance to the attending physicians within a clinically relevant time scale. The technical challenges of the clinical translation are presented, and our solutions to address them are discussed as well. A clinical case study then follows, in which some preliminary data collected from a pediatric GBM patient bearing an EGFR-amplified tumor are presented to demonstrate the general protocol and the workflow of the proposed clinical studies.
Abstract:
Paper prepared by Marion Panizzon and Charlotte Sieber-Gasser for the International Conference on the Political Economy of Liberalising Trade in Services, Hebrew University of Jerusalem, 14-15 June 2010. Recent literature has shed light on the economic potential of cross-border networks. These networks, consisting of expatriates and their acquaintances from abroad and at home, provide the basis for the creation of cross-border value added chains and therewith the means for turning brain drain into brain circulation. Both aspects are potentially valuable for economic growth in the developing world. Unilateral co-development policies operating through co-funding of expatriate business ventures, but also bilateral agreements liberalising circular migration for a limited set of persons, testify to the increasing awareness of governments of the potential which expatriate networks hold for economic growth in developing countries. While such targeted efforts are valuable, this paper argues that, viewed from a long-term perspective, these top-down, government-mandated diaspora stimulation programmes will not replace the market-driven liberalisation of infrastructure and other services in developing countries. Nor will they carry, in the case of circular labour migration, the political momentum to liberalise labour market admission for those non-nationals who will eventually emerge as the future transnational entrepreneurs. It will take a combination of mode 4 and infrastructure services openings-cum-regulation for countries on both sides of the spectrum to provide the basis and precondition for transnational business and entrepreneurial networks to emerge and translate into cross-border, value added production chains. Two key issues are of particular relevance in this context: (i) the services sector, especially in infrastructure, tends to suffer from inefficiencies, particularly in developing countries, and (ii) labour migration, a highly complex issue, still faces disproportionately rigid barriers despite well-documented global welfare gains. Both are hindrances for emerging markets to take full advantage of the potential of these cross-border networks. Adapting the legal framework to enhance the regulatory and institutional frameworks for services trade, especially in infrastructure services sectors (ISS) and labour migration, could provide the incentives necessary for brain circulation and strengthen cross-border value added chains by lowering transaction costs. This paper analyses the shortfalls of the global legal framework, namely the shallow status quo of GATS commitments in ISS and mode 4 in particular, in relation to stimulating brain circulation and the creation of cross-border value added chains in emerging markets. It highlights the necessity of adapting the legal framework, on both the global and the regional level, to stimulate broader and wider market access in the four key ISS sectors (telecommunications, transport, professional and financial services) in developing countries, as domestic supply capacity, global competitiveness and economic diversification in ISS sectors are necessary for mobilising expatriate returns, both physical and virtual. The paper argues that industrialised, labour-receiving countries need to offer mode 4 market access to wider categories of persons, especially to students, graduate trainees and young professionals from abroad.
Furthermore, free trade in semi-finished products and mode 4 market access are crucial for the creation of cross-border value added chains across the developing world. Finally, on the basis of a case study on Jordan, the paper discusses why the key features of trade agreements which promote circular migration and the creation of cross-border value added chains consist of trade liberalisation in services and liberal migration policies.