994 results for Process standardization


Relevance: 60.00%

Publisher:

Abstract:

In today's manufacturing scenario, rising energy prices, increasing ecological awareness, and changing consumer behaviors are driving decision makers to prioritize green manufacturing. The Internet of Things (IoT) paradigm promises to increase the visibility and awareness of energy consumption, thanks to smart sensors and smart meters at the machine and production-line level. Consequently, real-time energy consumption data from manufacturing processes can be easily collected and then analyzed to improve energy-aware decision-making. This thesis investigates how to utilize the adoption of the Internet of Things at the shop-floor level to increase energy awareness and the energy efficiency of discrete production processes. To achieve the main research goal, the research is divided into four sub-objectives and is accomplished in four main phases (i.e., studies).

In the first study, relying on a comprehensive literature review and on experts' insights, the thesis defines energy-efficient production management practices that are enhanced and enabled by IoT technology. The first study also explains the benefits that can be obtained by adopting such management practices. Furthermore, it presents a framework to support the integration of gathered energy data into a company's information technology tools and platforms, with the ultimate goal of highlighting how operational and tactical decision-making processes could leverage such data to improve energy efficiency.

Considering variable intraday energy prices, along with the availability of detailed machine-status energy data, the second study proposes a mathematical model to minimize energy consumption costs in single-machine production scheduling. The model makes decisions at the machine level to determine the launch times for job processing, idle time, when the machine must be shut down, "turning on" time, and "turning off" time. This enables the operations manager to implement the least expensive production schedule for a production shift.

In the third study, the research provides a methodology to help managers implement the IoT at the production-system level; it includes an analysis of current energy management and production systems at the factory, and recommends procedures for implementing the IoT to collect and analyze energy data. The methodology has been validated in a pilot study, where energy KPIs were used to evaluate energy efficiency.

In the fourth study, the goal is to introduce a way to achieve multi-level awareness of the energy consumed during production processes. The proposed method enables discrete factories to specify energy consumption, CO2 emissions, and the cost of the energy consumed at the operation, product, and order levels, while considering different energy sources and fluctuations in energy prices.

The results show that energy-efficient production management practices and decisions can be enhanced and enabled by the IoT. With the outcomes of the thesis, energy managers can approach IoT adoption in a benefit-driven way, by addressing the energy management practices that are closest to the factory's maturity level, targets, production type, etc. The thesis also shows that significant reductions in energy costs can be achieved simply by avoiding the high-price periods within a day. Furthermore, the thesis identifies the level at which energy consumption is monitored (i.e., the machine level), the time interval, and the level of energy data analysis as important factors in finding opportunities to improve energy efficiency. Finally, integrating real-time energy data with production data (when there are high levels of standardization in production processes and their data) is essential to enable factories to specify the amount and cost of the energy consumed, as well as the CO2 emitted, while producing a product, providing valuable information to decision makers at the factory level as well as to consumers and regulators.
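The core idea of the second study's cost-minimizing scheduler can be illustrated with a toy search over start hours. This is a minimal sketch under invented assumptions, not the thesis's mathematical model: the hourly prices and job parameters are hypothetical, and a real formulation would also model shut-down and restart states.

```python
from itertools import product

# Hypothetical intraday electricity prices (EUR/kWh) over an 8-hour shift.
prices = [0.30, 0.28, 0.12, 0.10, 0.11, 0.25, 0.32, 0.31]
# Jobs as (duration in hours, power draw in kW); illustrative values.
jobs = [(2, 5.0), (1, 3.0)]

def cost(starts):
    """Total energy cost when job i starts at hour starts[i]."""
    return sum(prices[s + h] * kw
               for (dur, kw), s in zip(jobs, starts)
               for h in range(dur))

def feasible(starts):
    """Jobs must fit inside the shift and not overlap on the single machine."""
    spans = sorted((s, s + dur) for (dur, _), s in zip(jobs, starts))
    return (all(end <= len(prices) for _, end in spans) and
            all(spans[i][1] <= spans[i + 1][0] for i in range(len(spans) - 1)))

def cheapest_schedule():
    """Brute-force search; a realistic model would use MILP for many jobs."""
    return min((s for s in product(range(len(prices)), repeat=len(jobs))
                if feasible(s)), key=cost)
```

With these numbers, the search places the 2 h job in the cheap hours 3 and 4 and the 1 h job at hour 2, avoiding the morning and evening price peaks exactly as the thesis's results describe.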

Relevance: 40.00%

Publisher:

Abstract:

The industry foundation classes (IFC) file format is one of the most complex and ambitious IT standardization projects currently being undertaken in any industry, focusing on the development of an open and neutral standard for exchanging building model data. Scientific literature related to the IFC standard has so far been predominantly technical; research looking at the IFC standard from an industry standardization perspective could offer valuable new knowledge for both theory and practice. This paper proposes the use of IT standardization and IT adoption theories, supported by studies done within construction IT, to lay a theoretical foundation for further empirical analysis of the standardization process of the IFC file format.

Relevance: 40.00%

Publisher:

Abstract:

There has been a demand for uniform CAD standards in the construction industry ever since the large-scale introduction of computer aided design systems in the late 1980s. While some standards have been widely adopted without much formal effort, other standards have failed to gain support even though considerable resources have been allocated for the purpose. Establishing a standard concerning building information modeling has been one particularly active area of industry development and scientific interest within recent years. In this paper, four different standards are discussed as cases: the IGES and DXF/DWG standards for representing the graphics in 2D drawings, the ISO 13567 standard for the structuring of building information on layers, and the IFC standard for building product models. Based on a literature study combined with two qualitative interview studies with domain experts, a process model is proposed to describe and interpret the contrasting histories of past CAD standardisation processes.

Relevance: 40.00%

Publisher:

Abstract:

In the present work, indigenous polymer-coated tin-free steel (TFS) cans were analyzed for their suitability for thermal processing and storage of fish and fish products, following standard methods. The raw materials used for the development of ready-to-eat thermally processed fish products were found to be fresh, and the values of the various biochemical and microbiological parameters of the raw materials were well within limits. Based on the analysis of commercial sterility, instrumental colour, texture, Warner-Bratzler (WB) shear force and sensory parameters, squid masala processed to an F0 value of 8 min, with a total process time of 38.5 min and a cook value of 92 min, was chosen as the optimum for squid masala in TFS cans, while shrimp curry processed to an F0 of 7 min, with a total process time of 44.0 min and a cook value of 91.1 min, was found to be ideal and was selected for the storage study. Squid masala and shrimp curry thermally processed in indigenous polymer-coated TFS cans were found to be acceptable even after one year of storage at room temperature, based on the analysis of various sensory and biochemical parameters. Analysis of the Commission Internationale de l'Eclairage (CIE) L*, a* and b* colour values showed that the duration of exposure to heat treatment influenced the colour parameters: lightness (L*) and yellowness (b*) decreased, and redness (a*) significantly increased, with an increase in processing time or a reduction in processing temperature. Instrumental analysis of texture showed that hardness 1 and 2 decreased with a reduction in retort temperature, while the cohesiveness value did not show any appreciable change with a decrease in processing temperature. Other texture profile parameters, such as gumminess, springiness and chewiness, decreased significantly with an increase in processing time. WB shear force values of mackerel meat processed at 130 °C were significantly higher than those of meat processed at 121.1 and 115 °C.
HTST processing of mackerel in brine helped reduce the process time and improve the quality. The study also indicated that indigenous polymer-coated TFS cans with easy-open ends can be a viable alternative to conventional tin and aluminium cans. The industry can utilize these cans for processing ready-to-eat fish and shellfish products for both domestic and export markets, which will help revive the canning industry in India.
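The F0 values quoted above come from the standard lethality calculation used in thermal processing (the general method, with reference temperature 121.1 °C and z = 10 °C). A minimal sketch, with an invented cold-spot temperature profile rather than the study's retort data:

```python
def f0_value(temps_c, dt_min=1.0, t_ref=121.1, z=10.0):
    """Accumulated lethality F0 (minutes): sum of 10**((T - Tref)/z) * dt
    over a cold-spot temperature profile sampled every dt_min minutes."""
    return sum(10 ** ((t - t_ref) / z) * dt_min for t in temps_c)

# Illustrative come-up / hold / cool-down profile (degrees C), not study data.
profile = [100, 110, 118, 121.1, 121.1, 121.1, 121.1, 115, 105]
```

One minute exactly at 121.1 °C contributes 1.0 min of lethality, while cooler minutes contribute exponentially less, which is why a 38.5 min total process time can accumulate only F0 = 8 min.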

Relevance: 30.00%

Publisher:

Abstract:

Business process modeling has undoubtedly emerged as a popular and relevant practice in Information Systems. Despite being an actively researched field, anecdotal evidence and experiences suggest that the focus of the research community is not always well aligned with the needs of industry. The main aim of this paper is, accordingly, to explore the current issues and the future challenges in business process modeling, as perceived by three key stakeholder groups (academics, practitioners, and tool vendors). We present the results of a global Delphi study with these three groups of stakeholders, and discuss the findings and their implications for research and practice. Our findings suggest that the critical areas of concern are standardization of modeling approaches, identification of the value proposition of business process modeling, and model-driven process execution. These areas are also expected to persist as business process modeling roadblocks in the future.

Relevance: 30.00%

Publisher:

Abstract:

Standardization is critical to scientists and regulators to ensure the quality and interoperability of research processes, as well as the safety and efficacy of the attendant research products. This is perhaps most evident in the case of “omics science,” which is enabled by a host of diverse high-throughput technologies such as genomics, proteomics, and metabolomics. But standards are of interest to (and shaped by) others far beyond the immediate realm of individual scientists, laboratories, scientific consortia, or governments that develop, apply, and regulate them. Indeed, scientific standards have consequences for the social, ethical, and legal environment in which innovative technologies are regulated, and thereby command the attention of policy makers and citizens. This article argues that standardization of omics science is both technical and social. A critical synthesis of the social science literature indicates that: (1) standardization requires a degree of flexibility to be practical at the level of scientific practice in disparate sites; (2) the manner in which standards are created, and by whom, will impact their perceived legitimacy and therefore their potential to be used; and (3) the process of standardization itself is important to establishing the legitimacy of an area of scientific research.

Relevance: 30.00%

Publisher:

Abstract:

To execute good design, one needs to know not only what to do and how to do it, but also why it should be done. For a standardization expert, the rationale of a standardization project may be found in the proposal for a new work item or the terms of reference, but rarely in the scope statement. However, it is also commonplace that the rationale of the project is not clearly stated in any of these parts. If the rationale is not surfaced in the early phases of a project, it is left to the design, sense-making and negotiation cycles of the design process to orient the project towards a goal. This paper explores how scope statements are used to position standardization projects in the IT for Learning, Education and Training (ITLET) domain, and how scope and rationale are understood in recent projects in European and international standardization. Based on two case studies, the paper suggests some actions for further research and improvement of the process.

Relevance: 30.00%

Publisher:

Abstract:

Process-Aware Information Systems (PAISs) support the execution of operational processes that involve people, resources, and software applications on the basis of process models. Process models describe vast, often infinite, collections of process instances, i.e., workflows supported by the systems. With the increasing adoption of PAISs, large process model repositories have emerged in companies and public organizations. These repositories constitute significant information resources. Accurate and efficient retrieval of process models and/or process instances from such repositories is interesting for multiple reasons, e.g., searching for similar models/instances, filtering, reuse, standardization, process compliance checking, verification of formal properties, etc. This paper proposes a technique for indexing process models that relies on their alternative representations, called untanglings. We show the use of untanglings for retrieval of process models based on the process instances that they specify, via a solution to the total executability problem. Experiments with industrial process models confirm that the proposed retrieval approach is up to three orders of magnitude faster than the state of the art.
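The paper's untangling-based index is too involved for a short sketch, but the retrieval pattern it accelerates (a cheap index lookup first, with the expensive executability check run only on the survivors) can be illustrated with a plain inverted index over activity labels. All model and activity names here are hypothetical, and this stands in for, rather than reproduces, the paper's technique:

```python
from collections import defaultdict

# Hypothetical repository: model name -> set of activity labels it contains.
repo = {
    "invoice_v1": {"receive", "check", "approve", "pay"},
    "invoice_v2": {"receive", "check", "reject"},
    "hiring":     {"screen", "interview", "hire"},
}

def build_index(models):
    """Inverted index: activity label -> names of models containing it."""
    index = defaultdict(set)
    for name, activities in models.items():
        for activity in activities:
            index[activity].add(name)
    return index

def candidates(index, trace):
    """Models whose activity sets cover every activity observed in a trace.
    A cheap filter; an exact instance-executability check would then run
    only on these candidates instead of the whole repository."""
    sets = [index.get(activity, set()) for activity in set(trace)]
    return set.intersection(*sets) if sets else set()
```

For example, the trace ["receive", "check", "pay"] narrows the repository to a single candidate model before any expensive behavioral check is needed.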

Relevance: 30.00%

Publisher:

Abstract:

This thesis presents novel techniques for addressing the problems of continuous change and inconsistencies in large process model collections. The developed techniques treat process models as a collection of fragments and facilitate version control, standardization and automated process model discovery using fragment-based concepts. Experimental results show that the presented techniques are beneficial in consolidating large process model collections, specifically when there is a high degree of redundancy.

Relevance: 30.00%

Publisher:

Abstract:

Empirical evidence shows that repositories of business process models used in industrial practice contain significant amounts of duplication. This duplication arises for example when the repository covers multiple variants of the same processes or due to copy-pasting. Previous work has addressed the problem of efficiently retrieving exact clones that can be refactored into shared subprocess models. This article studies the broader problem of approximate clone detection in process models. The article proposes techniques for detecting clusters of approximate clones based on two well-known clustering algorithms: DBSCAN and Hierarchical Agglomerative Clustering (HAC). The article also defines a measure of standardizability of an approximate clone cluster, meaning the potential benefit of replacing the approximate clones with a single standardized subprocess. Experiments show that both techniques, in conjunction with the proposed standardizability measure, accurately retrieve clusters of approximate clones that originate from copy-pasting followed by independent modifications to the copied fragments. Additional experiments show that both techniques produce clusters that match those produced by human subjects and that are perceived to be standardizable.
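The DBSCAN half of the approach can be sketched over a precomputed fragment-distance matrix. The distances below are invented stand-ins for a process-model similarity measure such as graph-edit distance; this is a minimal generic DBSCAN, not the article's implementation:

```python
def dbscan(dist, eps, min_pts):
    """Minimal DBSCAN over a precomputed symmetric distance matrix.
    Returns one cluster label per item; -1 marks noise."""
    n = len(dist)
    labels = [None] * n
    cluster = -1
    for p in range(n):
        if labels[p] is not None:
            continue
        neigh = [q for q in range(n) if dist[p][q] <= eps]
        if len(neigh) < min_pts:        # not a core point: mark as noise
            labels[p] = -1
            continue
        cluster += 1                    # start a new cluster from core point p
        labels[p] = cluster
        queue = [q for q in neigh if q != p]
        while queue:
            q = queue.pop()
            if labels[q] == -1:         # former noise becomes a border point
                labels[q] = cluster
            if labels[q] is not None:
                continue
            labels[q] = cluster
            q_neigh = [r for r in range(n) if dist[q][r] <= eps]
            if len(q_neigh) >= min_pts: # expand only through core points
                queue.extend(q_neigh)
    return labels

# Toy distances: fragments 0-2 are near-clones, fragment 3 is unrelated.
dist = [
    [0.0, 0.1, 0.1, 0.9],
    [0.1, 0.0, 0.1, 0.9],
    [0.1, 0.1, 0.0, 0.9],
    [0.9, 0.9, 0.9, 0.0],
]
```

Running `dbscan(dist, eps=0.2, min_pts=2)` groups the three near-clone fragments into one cluster and leaves the outlier as noise, which is exactly the shape of result the standardizability measure would then score.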

Relevance: 30.00%

Publisher:

Abstract:

In bovines, characterization of the biochemical and molecular determinants of the dominant follicle, before and during different time intervals after the gonadotrophin surge, requires precise identification of the dominant follicle of a follicular wave. The objectives of the present study were to standardize an experimental model in buffalo cows for accurately identifying the dominant follicle of the first wave of follicular growth, and to characterize changes in follicular fluid hormone concentrations as well as the expression patterns of various genes associated with the process of ovulation. From the day of estrus (day 0), animals were subjected to blood sampling and ultrasonography to monitor circulating progesterone levels and follicular growth. On day 7 of the cycle, animals were administered a PGF2α analogue (Tiaprost Trometamol, 750 μg i.m.) followed by an injection of hCG (2000 IU i.m.) 36 h later. Circulating progesterone levels progressively increased from day 1 of the cycle to 2.26 ± 0.17 ng/ml on day 7, but declined significantly after the PGF2α injection. A progressive increase in the size of the dominant follicle was observed by ultrasonography. The follicular fluid estradiol and progesterone concentrations in the dominant follicle were 600 ± 16.7 and 38 ± 7.6 ng/ml, respectively, before hCG injection; 24 h post-hCG injection, the concentration of estradiol had decreased to 125.8 ± 25.26 ng/ml, while that of progesterone had increased to 195 ± 24.6 ng/ml. Inh-α and Cyp19A1 expression in granulosa cells was maximal in the dominant follicle and declined in response to hCG treatment. Progesterone receptor, oxytocin and cyclooxygenase-2 expression in granulosa cells, regarded as markers of ovulation, was maximal at 24 h post-hCG. The expression of genes belonging to the protease superfamily was also examined; Cathepsin L expression decreased, while ADAMTS 3 and 5 expression increased, 24 h post-hCG treatment.
The results of the current study indicate that sequential treatment with PGF2α and hCG during the early estrous cycle in the buffalo cow leads to follicular growth that culminates in ovulation. The model system reported in the present study would be valuable for examining temporo-spatial changes in the periovulatory follicle immediately before and after the onset of the gonadotrophin surge.

Relevance: 30.00%

Publisher:

Abstract:

We explore how a standardization effort (i.e., when a firm pursues standards to further innovation) involves different search processes for knowledge and innovation outcomes. Using an inductive case study of Vanke, a leading Chinese property developer, we show how varying degrees of knowledge complexity and codification combine to produce a typology of four types of search process: active, integrative, decentralized and passive, resulting in four types of innovation outcome: modular, radical, incremental and architectural. We argue that when the standardization effort in a firm involves highly codified knowledge, incremental and architectural innovation outcomes are fostered, while modular and radical innovations are hindered. We discuss how standardization efforts can result in a second-order innovation capability, and conclude by calling for comparative research in other settings to understand how standardization efforts can be suited to different types of search process in different industry contexts.

Relevance: 30.00%

Publisher:

Abstract:

Background: Poor clinical handover has been associated with inaccurate clinical assessment and diagnosis, delays in diagnosis and test ordering, medication errors and decreased patient satisfaction in the acute care setting. Research on the handover process in the residential aged care sector is very limited.

Purpose: The aims of this study were to: (i) develop an in-depth understanding of the handover process in aged care by mapping all the key activities and their information dynamics; (ii) identify gaps in information exchange in the handover process and analyze the implications for resident safety; (iii) develop practical recommendations on how information and communication technology (ICT) can improve the process and resident safety.

Methods: The study was undertaken at a large metropolitan facility in NSW with more than 300 residents and a staff including 55 registered nurses (RNs) and 146 assistants in nursing (AINs). A total of 3 focus groups, 12 interviews and 3 observation sessions were conducted from July to October 2010. Process mapping was undertaken by translating the qualitative data via a five-category code book developed prior to the analysis.

Results: Three major sub-processes were identified and mapped: handover process (HOP) I, "information gathering by RN"; HOP II, "preparation of the preliminary handover sheet"; and HOP III, "execution of the handover meeting". Inefficiencies were identified in relation to the handover, including duplication of information, use of multiple communication modes and information sources, and lack of standardization.

Conclusion: By providing a robust process model of handover, this study makes two critical contributions to research in aged care: (i) a means to identify important, possibly suboptimal practices; and (ii) valuable evidence to plan and improve ICT implementation in residential aged care. The mapping of this process enabled analysis of gaps in information flow and their potential impacts on resident safety. It also offers a basis for further studies into a process that, despite its importance for securing resident safety and continuity of care, lacks research.

Relevance: 30.00%

Publisher:

Abstract:

Long-term living resource monitoring programs are commonly conducted globally to evaluate trends and the impacts of environmental change and management actions. For example, the Woods Hole bottom trawl survey has been conducted since 1963, providing critical information on the biology and distribution of finfish and shellfish in the North Atlantic (Despres-Patango et al. 1988). Similarly, in the Chesapeake Bay, the Maryland Department of Natural Resources (MDNR) Summer Blue Crab Trawl survey has been conducted continuously since 1977, providing management-relevant information on the abundance of this important commercial and recreational species. A key component of monitoring program design is the standardization of methods over time to allow for a continuous, unbiased data set. However, complete standardization is not always possible where multiple vessels, captains, and crews are required to cover large geographic areas (Tyson et al. 2006). Of equal concern is the technological advancement of gear, which serves to increase capture efficiency or ease of use. Thus, to maintain consistency and facilitate interpretation of reported data in long-term datasets, it is imperative to understand and quantify the impacts of changes in gear and vessels on catch per unit of effort (CPUE). While vessel changes are inevitable due to ageing fleets and other factors, gear changes often reflect a decision to exploit technological advances. A prime example of this is the otter trawl, a common tool for fisheries monitoring and research worldwide. Historically, trawl nets were constructed of natural materials such as cotton and linen, whereas modern net construction uses synthetic materials such as polyamide, polyester, polyethylene, and polypropylene (Nielson et al. 1983). Over the past several decades, polyamide material, hereafter referred to as nylon, has been a standard material used in otter trawl construction.
These trawls are typically dipped in a latex coating for increased abrasion resistance and are referred to as "green dipped." More recently, polyethylene netting has become popular among living resource monitoring agencies. Polyethylene netting, commonly known as sapphire netting, consists of braided filaments that form a very durable material, more resistant to abrasion than nylon. Additionally, sapphire netting allows for stronger knot strength during construction of the net, further increasing the net's durability and longevity. Sapphire also absorbs less water, with a specific gravity near 0.91 that allows the material to float, as compared to nylon with a specific gravity of 1.14 (Nielson et al. 1983). This same property results in a lightweight net which is more efficient in deployment, retrieval and fishing, particularly when towing from small vessels. While there are many advantages to sapphire netting, no comparative efficiency data are available for these two trawl net types. Traditional nylon netting has been used consistently for decades by the MDNR to generate long-term living resource data sets of great value; however, there is much interest in switching to the advanced materials. In addition, recent collaborative efforts between MDNR and NOAA's Cooperative Oxford Laboratory (NOAA-COL) require using different vessels for trawling in support of joint projects. In order to continue collaborative programs, or to change to more innovative netting materials, the influence of these changes must be demonstrated to be negligible, or correction factors must be determined. Thus, the objective of this study was to examine the influence of trawl net type, vessel type, and their interaction on capture efficiency.
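The correction-factor idea can be sketched as a simple ratio estimator over paired tows (the two gears fished side by side under matched conditions). This is a generic illustration, not the study's statistical design, and the catch figures are invented:

```python
def cpue(catch_kg, effort_h):
    """Catch per unit effort: catch divided by towing effort."""
    return catch_kg / effort_h

def correction_factor(paired_tows):
    """Mean ratio of new-gear CPUE to old-gear CPUE across paired tows.
    paired_tows: list of ((catch_old, effort_old), (catch_new, effort_new))."""
    ratios = [cpue(*new) / cpue(*old) for old, new in paired_tows]
    return sum(ratios) / len(ratios)

# Illustrative paired tows, (catch kg, effort h); not data from the study.
tows = [((10.0, 0.5), (12.0, 0.5)),
        ((8.0, 0.5), (10.0, 0.5)),
        ((20.0, 0.5), (23.0, 0.5))]
```

Multiplying the historical nylon-net CPUE series by the resulting factor (here 1.2) would place it on the scale of the new gear; a real analysis would also test whether the factor differs significantly from 1 before applying it.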

Relevance: 30.00%

Publisher:

Abstract:

The meat-to-water ratio used for washing was 1:3 for oil sardine and mackerel, but 1:2 for pink perch and croaker. Likewise, the washing process was repeated three times for oil sardine and mackerel, but two times for pink perch and croaker. The washed meat was mixed with 2.5% NaCl and set at +5°C and +40°C for 1, 2 and 3 h, and the gel strength and expressible water content were measured. Based on this study, a setting temperature of +40°C was selected; with respect to setting time, 1 h was selected for sardine and mackerel and 3 h for pink perch and croaker.
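The selection logic described above (choose the setting condition that maximizes gel strength, with expressible water as a secondary criterion) can be sketched as follows. The measurement values are invented placeholders; only the chosen conditions mirror the abstract's conclusions:

```python
# Hypothetical trial results (not the study's data):
# (species, set_temp_C, set_time_h) -> (gel_strength, expressible_water_pct)
trials = {
    ("sardine", 40, 1):    (620, 18.0),
    ("sardine", 40, 2):    (605, 19.5),
    ("sardine", 40, 3):    (590, 21.0),
    ("pink_perch", 40, 1): (410, 24.0),
    ("pink_perch", 40, 3): (470, 20.5),
}

def best_setting(species):
    """Setting condition with the highest gel strength for a species,
    breaking ties on lower expressible water content."""
    rows = {k: v for k, v in trials.items() if k[0] == species}
    return max(rows, key=lambda k: (rows[k][0], -rows[k][1]))
```

With these placeholder numbers, the rule reproduces the abstract's choices: 1 h setting for sardine and 3 h for pink perch, both at +40°C.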