954 results for Integrated process
Abstract:
This Master’s thesis examines the integrated implementation of management system standard requirements in an organization. The aim is to determine how the requirements of ISO 14001:2015 and ISO 9001:2015 can be integrated and implemented into an existing ISO 9001:2008 compliant management system. The research was carried out as action research, applying an operating model for the integrated use of management system standards created by the International Organization for Standardization. The phases of the operating model were applied to the target organization. The similarity and integration potential of the relevant standards were assessed using comparative matrices, and allocation of the requirements and conformity assessment of the processes were carried out by gap analysis. The main results indicate that the requirements of the relevant standards are largely equivalent or serve the same kind of purpose. The results also identify the processes of the target organization that matter most for requirement compliance, as well as the requirements that affect each process the most. Prioritizing compliance for the most important processes and implementing the most influential requirements gives organizations an opportunity to implement the integrated requirements effectively.
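The allocation-and-gap-analysis step described above can be pictured with a small sketch. The clause identifiers, process names and matrix entries below are invented for illustration; they are not taken from the thesis.

```python
# Illustrative sketch (not the thesis's actual matrices): a comparative
# matrix maps clauses between standards, an allocation maps requirements
# to existing processes, and a gap analysis flags uncovered requirements.

# Comparative matrix: requirement clause -> equivalent clause in the other
# standard (None where no direct counterpart exists). All entries hypothetical.
equivalence = {
    "9001:7.5 Documented information": "14001:7.5 Documented information",
    "9001:9.2 Internal audit": "14001:9.2 Internal audit",
    "9001:8.3 Design and development": None,  # no direct 14001 counterpart
}

# Allocation: which existing process (if any) implements each requirement
allocation = {
    "9001:7.5 Documented information": "Document control",
    "9001:9.2 Internal audit": "Audit programme",
    "9001:8.3 Design and development": None,  # gap
}

def gap_analysis(allocation):
    """Return the requirements not yet covered by any process."""
    return [req for req, proc in allocation.items() if proc is None]

print(gap_analysis(allocation))  # -> ['9001:8.3 Design and development']
```

In practice the matrices span hundreds of clauses, but the principle is the same: the gaps returned here are exactly the requirements whose implementation must be prioritized.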
Abstract:
Thesis (Master's)--University of Washington, 2016-08
Abstract:
Mobile network coverage is traditionally provided by outdoor macro base stations, which have a long range and serve a large number of customers. Because of modern passive houses and tightening construction legislation, mobile network service has deteriorated in many indoor locations. Typical solutions to the indoor coverage problem are expensive and require action from the mobile operator, so better solutions are constantly being researched. The solution presented in this thesis is based on Small Cell technology. Small Cells are low-power access nodes designed to provide voice and data services. This thesis concentrates on a specific Small Cell solution called a Pico Cell. The problem with Pico Cells, and Small Cells in general, is that they are a new technology for the mobile operator, and the possible problem sources and incidents have not been properly mapped. The purpose of this thesis is to identify the possible problems in Pico Cell deployment and how they could be solved within the operator’s incident management process. The research was carried out as a literature review and a case study, and the possible problems were investigated through lab testing. The automated Pico Cell deployment process was tested in a lab environment and its proper functionality was confirmed. The related network elements were also tested and examined, and the problems that emerged are resolvable. The operator’s existing incident management process can be used for Pico Cell troubleshooting with minor updates, although certain prerequisites have to be met before Pico Cell deployment can be considered. The main contribution of this thesis is the Pico Cell integrated incident management process. The presented solution works in theory and solves the problems found during lab testing. Limitations at the customer service level were addressed by adding the necessary tools and by designing a working question pattern.
Process structures for automated network discovery and Pico Cell specific radio parameter planning were also added to the mobile network management layer.
Abstract:
Contemporary integrated circuits are designed and manufactured in a globalized environment, leading to concerns about piracy, overproduction and counterfeiting. One class of techniques to combat these threats is circuit obfuscation, which seeks to modify the gate-level (or structural) description of a circuit without affecting its functionality, in order to increase the complexity and cost of reverse engineering. Most existing circuit obfuscation methods are based on inserting additional logic (“key gates”) or camouflaging existing gates so that a malicious user cannot recover the complete layout information without extensive computation to determine the key-gate values. However, when the netlist or circuit layout, although camouflaged, is available to an attacker, advanced logic analysis, circuit simulation tools and Boolean SAT solvers can reveal the unknown gate-level information without exhaustively trying all input vectors, reducing the complexity of reverse engineering. To counter this problem, several ‘provably secure’ logic encryption algorithms emphasizing methodical selection of camouflaged gates have been proposed in the literature [1,2,3]. The contribution of this paper is the creation and simulation of a new layout obfuscation method that uses don’t care conditions. We also present a proof of concept of a new functional (logic) obfuscation technique that not only conceals but modifies the circuit functionality in addition to the gate-level description, and can be applied automatically during the design process. Our layout obfuscation technique exploits don’t care conditions (namely, Observability and Satisfiability Don’t Cares) inherent in the circuit to camouflage selected gates and modify sub-circuit functionality while meeting the overall circuit specification.
Here, camouflaging or obfuscating a gate means replacing the candidate gate with a 4-to-1 multiplexer that can be configured to perform all possible 2-input/1-output functions, as proposed by Bao et al. [4]. It is important to emphasize that our approach not only obfuscates but alters sub-circuit-level functionality in an attempt to make IP piracy difficult. The choice of gates to obfuscate determines the effort required to reverse engineer or brute-force the design. We therefore propose a method of camouflaged gate selection based on the intersection of output logic cones. By choosing these candidate gates methodically, the complexity of reverse engineering can be made exponential, making it computationally very expensive to determine the true circuit functionality. We propose several heuristic algorithms to maximize reverse engineering complexity based on don’t-care-based obfuscation and methodical gate selection. Thus, the goal of protecting the design IP from malicious end-users is achieved, and it becomes significantly harder for rogue elements in the supply chain to use, copy or replicate the design with different logic. We analyze the reverse engineering complexity by applying our obfuscation algorithm to ISCAS-85 benchmarks. Our experimental results indicate that significant reverse engineering complexity can be achieved at minimal design overhead (for the proposed layout obfuscation methods, the average area overhead is 5.51% and the average delay overhead is about 7.732%). We discuss the strengths and limitations of our approach and suggest directions that may lead to improved logic encryption algorithms in the future.
References:
[1] R. Chakraborty and S. Bhunia, “HARPOON: An Obfuscation-Based SoC Design Methodology for Hardware Protection,” IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, vol. 28, no. 10, pp. 1493–1502, 2009.
[2] J. A. Roy, F. Koushanfar, and I. L. Markov, “EPIC: Ending Piracy of Integrated Circuits,” in Design, Automation and Test in Europe (DATE), 2008, pp. 1069–1074.
[3] J. Rajendran, M. Sam, O. Sinanoglu, and R. Karri, “Security Analysis of Integrated Circuit Camouflaging,” in ACM Conference on Computer and Communications Security, 2013.
[4] B. Liu and B. Wang, “Embedded Reconfigurable Logic for ASIC Design Obfuscation Against Supply Chain Attacks,” in Design, Automation and Test in Europe (DATE), 2014, pp. 1–6.
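The cone-intersection idea behind candidate-gate selection can be sketched in a few lines. The toy netlist and gate names below are invented for illustration; the paper's actual selection heuristics are considerably more involved.

```python
# Sketch of selecting camouflaging candidates from the intersection of
# output logic cones. Netlist and signal names are hypothetical.

def fanin_cone(netlist, node, primary_inputs):
    """Transitive fan-in cone of `node`: all gates that feed it."""
    cone, stack = set(), [node]
    while stack:
        g = stack.pop()
        if g in cone or g in primary_inputs:
            continue
        cone.add(g)
        stack.extend(netlist[g])  # predecessors (the gate's inputs)
    return cone

# Toy netlist: gate -> list of driving signals
netlist = {
    "g1": ["a", "b"], "g2": ["b", "c"],
    "g3": ["g1", "g2"], "g4": ["g2", "d"],
    "o1": ["g3"], "o2": ["g4"],
}
primary_inputs = {"a", "b", "c", "d"}

cone_o1 = fanin_cone(netlist, "o1", primary_inputs)
cone_o2 = fanin_cone(netlist, "o2", primary_inputs)

# Gates shared by several output cones are attractive candidates:
# obfuscating one such gate perturbs multiple outputs at once.
candidates = (cone_o1 & cone_o2) - {"o1", "o2"}
print(candidates)  # -> {'g2'}
```

Picking gates whose ambiguity propagates to many outputs is what drives the reverse-engineering effort toward the exponential complexity the abstract describes.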
Abstract:
For a mini-enterprise, understanding how to win customers and markets is clearly important: products must evolve continuously to satisfy customers, or the market will discard them. This is a major problem for mini-enterprise business process management today. Because the overall business processes of mini-enterprises are mostly built on integrated business processes, optimizing the integrated business process is vital for a successful mini-enterprise. This paper explores how to optimize the business processes of mini-enterprises based on the general principles of enterprise business process management and the main features of the mini-enterprise. The aim is to guide mini-enterprises in controlling, enhancing and optimizing their business processes so as to meet the internal requirements arising from the enterprise’s development, adapt to continuous changes in the outside environment and, most importantly, enhance or re-design processes to meet customer demands.
Abstract:
Single-cell oils (SCO) have been considered a promising source of third-generation biofuels, mainly in the final form of biodiesel. However, their high production costs have been a barrier to the commercialization of this commodity. The fast-growing yeast Rhodosporidium toruloides NCYC 921 has been widely reported as a potential SCO-producing yeast. In addition to its well-known high lipid content (which can be converted into biodiesel), it is rich in high-value-added products of commercial interest, such as carotenoids. Process design and integration may contribute to reducing the overall cost of biofuel and carotenoid production and is a mandatory step towards their commercialization. The present work addresses biomass disruption, extraction, fractionation and recovery of products, with special emphasis on high-value-added carotenoids (beta-carotene, torulene, torularhodin) and fatty acids directed to biodiesel. The chemical structure of torularhodin, with a terminal carboxylic group, imposes an extra challenge to its separation from fatty acids. The proposed feedstock is a fresh biomass pellet obtained directly by centrifugation from a 5 L fed-batch fermentation culture broth. Using wet instead of lyophilised biomass feedstock is a way to decrease processing energy costs and reduce downstream processing time. These results will contribute to a detailed process design, and the gathered data will be of crucial importance for a further study on Life-Cycle Assessment (LCA).
Abstract:
This research examines the process of placemaking in LeDroit Park, a residential Washington, DC, neighborhood with a historic district at its core. Unpacking the entwined physical and social evolution of the small community within the context of the Nation’s Capital, this analysis provides insight into the role of urban design and development, as well as historic designation, in shaping collective identity. Initially planned and designed in 1873 as a gated suburb just beyond the formal L’Enfant-designed city boundary, LeDroit Park was intended as a retreat for middle- and upper-class European Americans from the growing density and social diversity of the city. With a mixture of large romantic revival mansions and smaller frame cottages set on grassy plots evocative of an idealized rural village, the physical design was intentionally inwardly focused. This feeling of refuge was underscored by a physical fence surrounding the development, intended to prevent African Americans from nearby Howard University and the surrounding neighborhood from using the community’s private streets to access the City of Washington. Within two decades of its founding, LeDroit Park was incorporated into the District of Columbia, the surrounding fence was demolished, and the neighborhood was racially integrated. Due to increasingly stringent segregation laws and customs in the city, this period of integration lasted less than twenty years, and LeDroit Park developed into an elite African American enclave, using the urban design as a bulwark against the indignities of a segregated city. Throughout the 20th century, housing infill and construction increased density, yet the neighborhood never lost the feeling of security derived from the neighborhood plan. Highlighting the architecture and street design, neighbors successfully secured historic district designation in 1974 in order to halt campus expansion.
After a stalemate that lasted two decades, the neighborhood began another period of transformation, both racial and socio-economic, catalyzed by a multi-pronged investment program led by Howard University. Through interviews with long-term and new community members, this investigation asserts that the 140-year development history, including recent physical interventions, is integral to placemaking, shaping the material character as well as the social identity of residents.
Abstract:
Many maritime countries in Europe have implemented marine environmental monitoring programmes which include the measurement of chemical contaminants and related biological effects. How best to integrate data obtained in these two types of monitoring into meaningful assessments has been the subject of recent efforts by the International Council for Exploration of the Sea (ICES) Expert Groups. Work within these groups has concentrated on defining a core set of chemical and biological endpoints that can be used across maritime areas, defining confounding factors, supporting parameters and protocols for measurement. The framework comprised markers for concentrations of, exposure to and effects from, contaminants. Most importantly, assessment criteria for biological effect measurements have been set and the framework suggests how these measurements can be used in an integrated manner alongside contaminant measurements in biota, sediments and potentially water. Output from this process resulted in OSPAR Commission (www.ospar.org) guidelines that were adopted in 2012 on a trial basis for a period of 3 years. The developed assessment framework can furthermore provide a suitable approach for the assessment of Good Environmental Status (GES) for Descriptor 8 of the European Union (EU) Marine Strategy Framework Directive (MSFD).
Abstract:
This thesis proposes a generic visual perception architecture for robotic clothes perception and manipulation. The proposed architecture is fully integrated with a stereo vision system and a dual-arm robot and is able to perform a number of autonomous laundering tasks. Clothes perception and manipulation is a novel research topic in robotics that has experienced rapid development in recent years. Compared to perceiving and manipulating rigid objects, clothes perception and manipulation poses a greater challenge for two reasons: firstly, deformable clothing requires precise (high-acuity) visual perception and dexterous manipulation; secondly, as clothing approximates a non-rigid 2-manifold in 3-space that can adopt a quasi-infinite configuration space, the potential variability in the appearance of clothing items makes them difficult for a machine to understand, identify uniquely and interact with. From an applications perspective, and as part of the EU CloPeMa project, the integrated visual perception architecture refines a pre-existing clothing manipulation pipeline by completing pre-wash clothes (category) sorting (using single-shot or interactive perception for garment categorisation and manipulation) and post-wash dual-arm flattening. To the best of the author’s knowledge, the autonomous clothing perception and manipulation solutions presented here were first proposed and reported by the author. All of the robot demonstrations reported in this work follow a perception-manipulation methodology in which visual and tactile feedback (in the form of surface wrinkledness captured by the high-accuracy depth sensor, i.e. the CloPeMa stereo head, or the predictive confidence modelled by Gaussian Processes) serve as the halting criteria in the flattening and sorting tasks, respectively.
From a scientific perspective, the proposed visual perception architecture addresses the above challenges by parsing and grouping 3D clothing configurations hierarchically, from low-level curvatures, through mid-level surface shape representations (providing topological descriptions and 3D texture representations), to high-level semantic structures and statistical descriptions. A range of visual features, such as Shape Index, Surface Topologies Analysis and Local Binary Patterns, have been adapted within this work to parse clothing surfaces and textures, and several novel features have been devised, including B-Spline Patches with Locality-Constrained Linear coding and Topology Spatial Distance, to describe and quantify generic landmarks (wrinkles and folds). The essence of the proposed architecture is 3D generic surface parsing and interpretation, which is critical to underpinning a number of laundering tasks and has the potential to be extended to other rigid and non-rigid object perception and manipulation tasks. The experimental results presented in this thesis demonstrate that: firstly, the proposed grasping approach achieves 84.7% accuracy on average; secondly, the proposed flattening approach is able to flatten towels, t-shirts and pants (shorts) within 9 iterations on average; thirdly, the proposed clothes recognition pipeline can recognise clothes categories from highly wrinkled configurations and advances the state of the art by 36% in classification accuracy, achieving an 83.2% true-positive classification rate when discriminating between five categories of clothes; and finally, the Gaussian Process based interactive perception approach exhibits a substantial improvement over single-shot perception. Accordingly, this thesis has advanced the state of the art of robot clothes perception and manipulation.
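Of the features named above, Shape Index has a standard closed form due to Koenderink and van Doorn, computed from the two principal curvatures of the surface. A minimal sketch of that formula follows; this is the textbook definition, not the thesis's own implementation, and sign conventions for the curvature labels (ridge vs. rut, cap vs. cup) vary between authors.

```python
import math

def shape_index(k1, k2):
    """Koenderink shape index from principal curvatures.

    Maps local surface type onto [-1, 1]; saddle-like points land near 0,
    with ridges/ruts and caps/cups toward the ends of the scale (which end
    is which depends on the curvature sign convention in use).
    Undefined at umbilic points (k1 == k2), including planes.
    """
    if k1 < k2:
        k1, k2 = k2, k1  # enforce k1 >= k2
    if k1 == k2:
        raise ValueError("shape index undefined at umbilic/planar points")
    return (2.0 / math.pi) * math.atan((k2 + k1) / (k2 - k1))

# A cylinder-like point: one strong curvature, one zero
print(shape_index(1.0, 0.0))  # -> -0.5
```

On a depth map of cloth, evaluating this per pixel (with curvatures estimated from local surface fits) yields the kind of wrinkle-highlighting response map the mid-level parsing stage builds on.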
Abstract:
Part 4: Transition Towards Product-Service Systems
Abstract:
This chapter offers a framework for combining critical language policy with critical discourse studies (CDS) to analyse language policy as a process in the context of minority language policy in Wales. I propose a discursive approach to language policy, which starts from the premise that language policy is constituted, enacted, interpreted and (re)contextualised in and through language. This approach extends the critical language policy framework provided by Shohamy (Language policy: hidden agendas and new approaches. Routledge, London, 2006) and integrates perspectives from the context-sensitive discourse-historical approach in CDS. It incorporates discourse as an essential lens through which policy mechanisms, ideologies and practices are constituted and de facto language policy materialises. This chapter argues that conceptualising and analysing language policy as a discursive phenomenon enables a better understanding of the multi-layered nature of language policy that shapes the management and experience of corporate bilingualism in Wales.
Abstract:
Efficient numerical models facilitate the study and design of solid oxide fuel cells (SOFCs), stacks, and systems. Whilst accuracy and reliability of the computed results are usually sought by researchers, the corresponding modelling complexity can cause practical difficulties regarding implementation flexibility and computational cost. The main objective of this article is to adapt a simple but viable numerical tool for evaluation of our experimental rig. Accordingly, a model of a multi-layer SOFC surrounded by a constant-temperature furnace is presented, trained and validated against experimental data. The model consists of a four-layer structure comprising a stand, two interconnects, and the PEN (Positive electrode-Electrolyte-Negative electrode), each approximated by a lumped parameter model. The heating process through the surrounding chamber is also considered. We used a set of V-I characteristic data for parameter adjustment, followed by model verification against two independent data sets. The model results show good agreement with practical data, offering a significant improvement over reduced models in which the impact of external heat loss is neglected. Furthermore, a thermal analysis for adiabatic and non-adiabatic processes is carried out to capture the thermal behaviour of a single cell, followed by a polarisation loss assessment. Finally, model-based design of experiments is demonstrated for a case study.
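The parameter-adjustment step against V-I data can be illustrated at its very simplest with a linear ohmic approximation, V = E0 - R*i, fitted by least squares. The article's lumped model is far richer (thermal layers, activation and concentration losses); the model form and the data below are a hypothetical minimal sketch.

```python
# Minimal sketch of V-I parameter adjustment: least-squares fit of
# V = E0 - R * i. Data points are synthetic, not from the article.

def fit_vi(currents, voltages):
    """Ordinary least-squares fit of V = E0 - R * i; returns (E0, R)."""
    n = len(currents)
    si, sv = sum(currents), sum(voltages)
    sii = sum(i * i for i in currents)
    siv = sum(i * v for i, v in zip(currents, voltages))
    slope = (n * siv - si * sv) / (n * sii - si * si)
    e0 = (sv - slope * si) / n
    return e0, -slope  # R is the negated slope

# Synthetic "measurements" generated from E0 = 1.05 V, R = 0.4 ohm
currents = [0.0, 0.2, 0.4, 0.6, 0.8]
voltages = [1.05 - 0.4 * i for i in currents]

e0, r = fit_vi(currents, voltages)
print(round(e0, 3), round(r, 3))  # -> 1.05 0.4
```

With real rig data the same fit yields a first estimate of open-circuit voltage and area-specific resistance, which richer model terms (and the furnace heat-loss coupling) then refine.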
Abstract:
The literature clearly links the quality and capacity of a country’s infrastructure to its economic growth and competitiveness. This thesis analyses the historic national and spatial distribution of investment by the Irish state in its physical networks (water, wastewater and roads) across the 34 local authorities and examines how Ireland is perceived internationally relative to its economic counterparts. An appraisal of the current status and shortcomings of Ireland’s infrastructure is undertaken with key stakeholders from foreign direct investment companies and national policymakers in order to identify Ireland’s infrastructural gaps, along with current challenges in how the country delivers infrastructure. These interviews identified many issues with how infrastructure decision-making is currently undertaken, which led to an evaluation of how other countries inform decision-making; accordingly, this thesis presents a framework for how and why Ireland should embrace a Systems of Systems (SoS) approach to infrastructure decision-making going forward. In undertaking this study, a number of other infrastructure challenges were identified: significant political interference in infrastructure decision-making and delivery; the need for a national agency to remove the existing ‘silo’ mentality in infrastructure delivery; and the ways tax incentives can interfere with the market, and their significance. The two key infrastructure gaps identified during the interview process were: the need for government intervention in the rollout of sufficient communication capacity at a competitive cost outside of Dublin; and the urgent need to address water quality and capacity, with approximately 25% of the population currently served by water of unacceptable quality. Despite considerable investment in its national infrastructure, Ireland’s infrastructure performance continues to trail its economic partners in the Eurozone and OECD.
Ireland is projected to have the highest growth rate in the Eurozone in 2015 and 2016, albeit having required a bailout in 2010, and, at the time of writing, is beginning to invest in its infrastructure networks again. This thesis proposes the development and implementation of an SoS approach to infrastructure decision-making based on: existing spatial and capacity data for each of the constituent infrastructure networks; and scenario computation and analysis of alternative drivers, e.g. demographic change, economic variability and demand/capacity constraints. The output of such an analysis would provide the valuable evidence, lacking in historic investment decisions, upon which policy makers and decision makers alike could rely.
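At its simplest, the scenario computation proposed above amounts to projecting demand drivers against recorded network capacity and flagging where gaps open up. The regions, capacity figures and growth rates in this sketch are invented for illustration and are not drawn from the thesis.

```python
# Toy SoS-style scenario check: project demand per region under a growth
# driver and flag regions where it exceeds network capacity.
# All figures are hypothetical.

def capacity_gaps(capacity, demand, growth, years):
    """Project demand forward `years` years; return regions with shortfalls."""
    gaps = {}
    for region, cap in capacity.items():
        projected = demand[region] * (1 + growth[region]) ** years
        if projected > cap:
            gaps[region] = round(projected - cap, 1)  # size of shortfall
    return gaps

capacity = {"Dublin": 100.0, "Cork": 40.0}  # e.g. water supply, Ml/day
demand   = {"Dublin": 90.0,  "Cork": 30.0}  # current demand, Ml/day
growth   = {"Dublin": 0.02,  "Cork": 0.01}  # annual demand growth (driver)

print(capacity_gaps(capacity, demand, growth, years=10))  # -> {'Dublin': 9.7}
```

A full SoS analysis would run many such drivers jointly across interdependent networks (water, wastewater, roads, communications), but the evidence it produces for decision-makers has this basic shape: where, and by how much, projected demand outruns capacity.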
Abstract:
CERN, the European Organization for Nuclear Research, is one of the largest research centres worldwide, responsible for several discoveries in physics as well as in computer science. The CERN Document Server, also known as CDS Invenio, is software developed at CERN that aims to provide a set of tools for managing digital libraries. In order to improve the functionality of CDS Invenio, a new module called BibCirculation was developed to manage books (and other items) from the CERN library, working as an Integrated Library System. This thesis describes the steps taken to achieve the project’s goals, explaining, among other aspects, the process of integration with the other existing modules as well as the approach used to associate information about books with the CDS Invenio metadata. A detailed account of the entire implementation and testing process is also provided. Finally, the conclusions of this project and ideas for future development are presented.
Abstract:
In this study, a novel hybrid thermochemical-biological refinery integrated with a power-to-x approach was developed for obtaining biopolymers (namely polyhydroxyalkanoates, PHA). Within this concept, a three-stage process scheme was developed, comprising: (i) thermochemical conversion via integrated pyrolysis-gasification technologies; (ii) anaerobic fermentation of the bioavailable products obtained through either thermochemistry or water electrolysis for volatile fatty acid (VFA) production; and (iii) VFA-to-PHA bioconversion via an original microaerophilic-aerobic process. In the first stage of the proposed biorefinery, lignocellulosic (wooden) biomass was converted into theoretically fermentable products (i.e. bioavailables), defined as syngas and the water-soluble fraction of pyrolytic liquid (WS); biochar, used as a biocatalyst material; and a dense oil, used as a liquid fuel. Within the integrated pyrolysis-gasification process, biomass was efficiently converted into fermentable intermediates representing up to 66% of the biomass chemical energy content on a chemical oxygen demand (COD) basis. In the second stage, anaerobic fermentation for obtaining VFA-rich streams, three different downstream processes were investigated. The first fermentation test was acidogenic bioconversion of the WS material obtained through pyrolysis of biomass within an original biochar-packed bioreactor, which sustained a volumetric productivity (VP) of up to 0.6 gCOD/L-day. Second, the C1-rich syngas from the pyrolysis-gasification stage was fermented within a novel char-based biofilm sparger reactor (CBSR), where a VP of up to 9.8 gCOD/L-day was observed. Third was homoacetogenic bioconversion within the innovative power-to-x pathway for obtaining commodities via renewable energy sources.
More specifically, water-electrolysis-derived H2 and CO2 (a primary greenhouse gas) were successfully bio-utilized by anaerobic mixed cultures into VFA within the CBSR system (VP: 18.2 gCOD/L-day). In the last stage of the developed biorefinery scheme, the VFA was converted into biopolymers within a new continuous microaerophilic-aerobic microplant, where sludge containing up to 60% PHA was obtained.
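The volumetric productivities quoted throughout the abstract (0.6, 9.8 and 18.2 gCOD/L-day) follow from a simple rate definition; the reactor volume and duration below are invented to illustrate the calculation and are not taken from the study.

```python
# Volumetric productivity: product formed per working volume per time.
# Example figures are hypothetical.

def volumetric_productivity(g_cod, volume_l, days):
    """VP in gCOD/L-day."""
    return g_cod / (volume_l * days)

# e.g. 36.4 gCOD of VFA accumulated in a 0.5 L reactor over 4 days
vp = volumetric_productivity(36.4, 0.5, 4)
print(vp)  # -> 18.2, the same order as the CBSR figure quoted above
```

Normalising by working volume and time is what makes the three fermentation routes directly comparable despite their different reactor configurations.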