897 results for Design For Manufacturing and Assembly
Biomarkers and Bacterial Pneumonia Risk in Patients with Treated HIV Infection: A Case-Control Study
Abstract:
Background: Despite advances in HIV treatment, bacterial pneumonia continues to cause considerable morbidity and mortality in patients with HIV infection. Studies of biomarker associations with bacterial pneumonia risk in treated HIV-infected patients do not currently exist. Methods: We performed a nested, matched, case-control study among participants randomized to continuous combination antiretroviral therapy (cART) in the Strategies for Management of Antiretroviral Therapy trial. Patients who developed bacterial pneumonia (cases) and patients without bacterial pneumonia (controls) were matched 1:1 on clinical center, smoking status, age, and baseline cART use. Baseline levels of Club Cell Secretory Protein 16 (CC16), Surfactant Protein D (SP-D), C-reactive protein (hsCRP), interleukin-6 (IL-6), and d-dimer were compared between cases and controls. Results: Cases (n = 72) and controls (n = 72) were 25.7% female, 51.4% black, 65.3% current smokers, 9.7% diabetic, and 36.1% co-infected with Hepatitis B/C, and 75.0% were on cART at baseline. Median (IQR) age was 45 (41, 51) years, with a CD4+ count of 553 (436, 690) cells/mm3. Baseline CC16 and SP-D were similar between cases and controls, but hsCRP was significantly higher in cases than in controls (2.94 mg/mL in cases vs. 1.93 mg/mL in controls; p = 0.02). IL-6 and d-dimer levels were also higher in cases than in controls, though the differences were not statistically significant (p = 0.06 and 0.10, respectively). Conclusions: In patients with cART-treated HIV infection, higher levels of systemic inflammatory markers were associated with increased bacterial pneumonia risk, while two pulmonary-specific inflammatory biomarkers, CC16 and SP-D, were not associated with bacterial pneumonia risk.
Abstract:
Background: Immunosuppressed individuals face serious morbidity and mortality from influenza; it is therefore important to understand the safety and immunogenicity of influenza vaccination in this population. Methods: This multicenter cohort study evaluated the immunogenicity and reactogenicity of an inactivated, monovalent, non-adjuvanted pandemic (H1N1) 2009 vaccine among the elderly and among HIV-infected, rheumatoid arthritis (RA), cancer, kidney transplant, and juvenile idiopathic arthritis (JIA) patients. Participants were included during routine clinical visits and vaccinated according to conventional influenza vaccination schedules. Antibody response was measured by the hemagglutination-inhibition assay before and 21 days after vaccination. Results: 319 patients with cancer, 260 with RA, 256 HIV-infected patients, 149 elderly individuals, 85 kidney transplant recipients, and 83 patients with JIA were included. The proportions of seroprotection and seroconversion and the geometric mean titer ratios post-vaccination were, respectively: 37.6%, 31.8%, and 3.2 among kidney transplant recipients; 61.5%, 53.1%, and 7.5 among RA patients; 63.1%, 55.7%, and 5.7 among the elderly; 59.0%, 54.7%, and 5.9 among HIV-infected patients; 52.4%, 49.2%, and 5.3 among cancer patients; and 85.5%, 78.3%, and 16.5 among JIA patients. The vaccine was well tolerated, with no reported severe adverse events. Conclusions: The vaccine was safe in all groups, with acceptable immunogenicity among the elderly and JIA patients; however, new vaccination strategies should be explored to improve the immune response of immunocompromised adult patients. (ClinicalTrials.gov, NCT01218685)
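As a point of reference for the figures above, the three reported endpoints can be computed from paired pre- and post-vaccination hemagglutination-inhibition titers. The sketch below is illustrative only (the titer values are hypothetical) and assumes the conventional HI criteria: seroprotection as a post-vaccination titer >= 1:40, seroconversion as at least a four-fold rise reaching >= 1:40, and the GMT ratio as the ratio of post- to pre-vaccination geometric mean titers.

```python
import numpy as np

def immunogenicity_summary(pre, post):
    """Summarise the HI antibody response of one study group.

    pre, post: reciprocal HI titers before and 21 days after vaccination.
    Thresholds follow the conventional HI criteria (assumed here):
    seroprotection = post titer >= 40,
    seroconversion = >= 4-fold rise reaching a post titer >= 40.
    """
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    seroprotection = np.mean(post >= 40) * 100
    seroconversion = np.mean((post / pre >= 4) & (post >= 40)) * 100
    gmt_ratio = np.exp(np.mean(np.log(post)) - np.mean(np.log(pre)))
    return seroprotection, seroconversion, gmt_ratio

# Hypothetical titers, for illustration only
sp, sc, gmr = immunogenicity_summary([10, 10, 20, 40], [40, 80, 160, 40])
print(f"seroprotection {sp:.1f}%, seroconversion {sc:.1f}%, GMT ratio {gmr:.1f}")
```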
Abstract:
Study I: Real Wage Determination in the Swedish Engineering Industry. This study uses the monopoly union model to examine the determination of real wages, and in particular the effects of active labour market programmes (ALMPs) on real wages, in the engineering industry. Quarterly data for the period 1970:1 to 1996:4 are used in a cointegration framework, utilising Johansen's maximum likelihood procedure. On the basis of the Johansen (trace) test results, vector error correction (VEC) models are created in order to model the determination of real wages in the engineering industry. The estimation results support the presence of a long-run wage-raising effect of rises in labour productivity, in the tax wedge, in the alternative real consumer wage and in real UI benefits. The estimation results also support the presence of a long-run wage-raising effect due to positive changes in the participation rates of ALMPs, relief jobs and labour market training. This could be interpreted as meaning that the possibility of being a participant in an ALMP increases the utility for workers of not being employed in the industry, which in turn could increase real wages in the industry in the long run. Finally, the estimation results show evidence of a long-run wage-reducing effect due to positive changes in the unemployment rate. Study II: Intersectoral Wage Linkages in Sweden. The purpose of this study is to investigate whether the wage-setting in certain sectors of the Swedish economy affects the wage-setting in other sectors. The theoretical background is the Scandinavian model of inflation, which states that the wage-setting in the sectors exposed to international competition affects the wage-setting in the sheltered sectors of the economy. The Johansen maximum likelihood cointegration approach is applied to quarterly data on Swedish sector wages for the period 1980:1–2002:2. Different vector error correction (VEC) models are created, based on assumptions as to which sectors are exposed to international competition and which are not. The adaptability of wages between sectors is then tested by imposing restrictions on the estimated VEC models. Finally, Granger causality tests are performed in the different restricted/unrestricted VEC models to test for sector wage leadership. The empirical results indicate considerable adaptability of wages between manufacturing, construction, the wholesale and retail trade, the central government sector and the municipalities and county councils sector. This is consistent with the assumptions of the Scandinavian model. Further, the empirical results indicate a low level of adaptability of wages between the financial sector and manufacturing, and between the financial sector and the two public sectors. The Granger causality tests provide strong evidence for the presence of intersectoral wage causality, but no evidence of a wage-leading role, in line with the assumptions of the Scandinavian model, for any of the sectors. Study III: Wage and Price Determination in the Private Sector in Sweden. The purpose of this study is to analyse wage and price determination in the private sector in Sweden during the period 1980–2003. The theoretical background is a variant of the "Imperfect competition model of inflation", which assumes imperfect competition in the labour and product markets. According to the model, wages and prices are determined as a result of a "battle of mark-ups" between trade unions and firms.
The Johansen maximum likelihood cointegration approach is applied to quarterly Swedish data on consumer prices, import prices, private-sector nominal wages, private-sector labour productivity and the total unemployment rate for the period 1980:1–2003:3. The chosen cointegration rank of the estimated vector error correction (VEC) model is two. Thus, two cointegration relations are assumed: one for private-sector nominal wage determination and one for consumer price determination. The estimation results indicate that an increase in consumer prices of one per cent lifts private-sector nominal wages by 0.8 per cent. Furthermore, an increase in private-sector nominal wages of one per cent increases consumer prices by one per cent. An increase of one percentage point in the total unemployment rate reduces private-sector nominal wages by about 4.5 per cent. The long-run effects of private-sector labour productivity and import prices on consumer prices are about –1.2 and 0.3 per cent, respectively. The Rehnberg agreement during 1991–92 and the monetary policy shift in 1993 affected the determination of private-sector nominal wages, private-sector labour productivity, import prices and the total unemployment rate. The "offensive" devaluation of the Swedish krona by 16 per cent in 1982:4, as well as the move to a floating krona and its substantial depreciation at that time, affected the determination of import prices.
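Read schematically, the long-run elasticities quoted above correspond to two cointegration relations of roughly the following form (notation assumed here: w, p, p_m and q are logs of private-sector nominal wages, consumer prices, import prices and labour productivity, U is the unemployment rate in percentage points; constants and short-run dynamics are omitted):

```latex
% Schematic long-run relations implied by the quoted estimates
% (w, p, p_m, q in logs; U = unemployment rate in percentage points)
\[
\begin{aligned}
  w &\approx 0.8\,p \;-\; 0.045\,U \\
  p &\approx 1.0\,w \;-\; 1.2\,q \;+\; 0.3\,p_m
\end{aligned}
\]
```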
Abstract:
Intangible resources have attracted the interest of scholars from different research areas owing to their importance as crucial factors for firm performance; yet contributions to this field still lack a theoretical framework. This research analyses the state-of-the-art results reached in the literature concerning intangibles, their main features, and the related evaluation problems and models. In search of a possible theoretical framework, the research develops a kind of indirect analysis of intangibles through the theories of the firm, their critiques and their developments. The heterodox approaches of evolutionary theory and the resource-based view are indicated as possible frameworks. Based on this theoretical analysis, organization capital (OC) is identified, owing to its features, as the most important intangible for firm performance. Empirical studies on the relationship between intangibles and firm performance have been sporadic and have failed to reach firm conclusions with respect to OC; in an attempt to fill this gap, the effect of OC is tested on a large sample of European firms using the Compustat Global database. OC is proxied by capitalizing an income statement item (Selling, General and Administrative expenses) that includes expenses linked to information technology, business process design, reputation enhancement and employee training. This measure of OC is employed in a cross-sectional estimation of a firm-level production function, modelled with different functional specifications (Cobb-Douglas and Translog), that measures the contribution of OC to firm output and profitability. Results are robust and confirm the importance of OC for firm performance.
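To make the estimation strategy concrete, a Cobb-Douglas version of the firm-level production function described above can be sketched as follows (notation assumed here for illustration, with OC proxied by capitalized SG&A expenses; the Translog specification adds squared and interaction terms in the logged inputs):

```latex
% Firm-level Cobb-Douglas production function augmented with organization
% capital: Y = output, K = physical capital, L = labour, OC = capitalized
% SG&A expenses; i indexes firms. (Notation assumed for illustration.)
\[
\ln Y_i \;=\; \alpha_0 \;+\; \alpha_K \ln K_i \;+\; \alpha_L \ln L_i
          \;+\; \alpha_{OC} \ln OC_i \;+\; \varepsilon_i
\]
```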
Abstract:
The development of the digital electronics market is founded on the continuous reduction of transistor size in order to reduce area, power and cost and to increase the computational performance of integrated circuits. This trend, known as technology scaling, is approaching the nanometer scale. The uncertainty of the lithographic process in the manufacturing stage increases as transistor sizes scale down, resulting in larger parameter variation in future technology generations. Furthermore, the exponential relationship between the leakage current and the threshold voltage is limiting the scaling of the threshold and supply voltages, increasing the power density and creating local thermal issues, such as hot spots, thermal runaway and thermal cycles. In addition, the introduction of new materials and the smaller device dimensions are reducing transistor robustness, which, combined with high temperatures and frequent thermal cycles, speeds up wear-out processes. These effects are no longer addressable at the process level alone. Consequently, deep sub-micron devices will require solutions that involve several design levels, such as system and logic, and new approaches called Design For Manufacturability (DFM) and Design For Reliability. The purpose of these approaches is to bring awareness of device reliability and manufacturability into the early design stages, in order to introduce logic- and system-level techniques able to cope with yield and reliability loss. The ITRS roadmap suggests the following research steps to integrate design for manufacturability and reliability into the standard automated CAD design flow: i) the implementation of new analysis algorithms able to predict the system's thermal behavior and its impact on power and speed performance; ii) high-level wear-out models able to predict the mean time to failure (MTTF) of the system; iii) statistical performance analysis able to predict the impact of process variation, both random and systematic. The new analysis tools have to be developed alongside new logic and system strategies to cope with future challenges, for instance: i) thermal management strategies that increase the reliability and lifetime of devices by acting on tunable parameters, such as supply voltage or body bias; ii) error detection logic able to interact with compensation techniques such as Adaptive Supply Voltage (ASV), Adaptive Body Bias (ABB) and error recovery, in order to increase yield and reliability; iii) architectures that are fundamentally resistant to variability, including locally asynchronous designs, redundancy, and error correcting signal encodings (ECC). The literature already features works addressing the prediction of MTTF, papers focusing on thermal management in general-purpose chips, and publications on statistical performance analysis. In my PhD research activity, I investigated the need for thermal management in future embedded low-power Network-on-Chip (NoC) devices. I developed a thermal analysis library, which has been integrated into a cycle-accurate NoC simulator and into an FPGA-based NoC simulator. The results have shown that an accurate layout distribution can avoid the onset of hot spots in a NoC chip. Furthermore, the application of thermal management can reduce the temperature and the number of thermal cycles, increasing system reliability. The thesis therefore advocates the need to integrate thermal analysis into the first design stages of embedded NoC design.
Later on, I focused my research on the development of a statistical process variation analysis tool able to address both random and systematic variations. The tool was used to analyze the impact of self-timed asynchronous logic stages in an embedded microprocessor. The results confirmed the capability of self-timed logic to increase manufacturability and reliability. Furthermore, we used the tool to investigate the suitability of low-swing techniques for NoC system communication under process variations. In this case we found that low-swing links have superior robustness to systematic process variation and respond well to compensation techniques such as ASV and ABB. Hence low-swing signalling is a good alternative to standard CMOS communication in terms of power, speed, reliability and manufacturability. In summary, my work proves the advantage of integrating a statistical process variation analysis tool into the first stages of the design flow.
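The flavour of such a statistical analysis can be conveyed with a toy Monte Carlo model: critical-path delay under a per-die (systematic) and a per-gate (random) threshold-voltage variation, with an alpha-power-law delay model. This is a minimal sketch with hypothetical parameters, not the tool developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical nominal parameters (illustrative, not from the thesis)
VDD_NOM, VTH_NOM, ALPHA = 1.0, 0.3, 1.3   # supply, threshold, alpha-power exponent
N_DIES, N_GATES = 1000, 64                # Monte Carlo dies, gates on the critical path

def gate_delay(vdd, vth):
    """Alpha-power-law delay model: delay ~ vdd / (vdd - vth)^alpha."""
    return vdd / (vdd - vth) ** ALPHA

# Systematic (per-die) and random (per-gate) threshold-voltage variation
vth_die = rng.normal(0.0, 0.015, size=(N_DIES, 1))         # correlated across a die
vth_rnd = rng.normal(0.0, 0.010, size=(N_DIES, N_GATES))   # independent per gate
delays = gate_delay(VDD_NOM, VTH_NOM + vth_die + vth_rnd).sum(axis=1)

nominal = N_GATES * gate_delay(VDD_NOM, VTH_NOM)
yield_pct = 100 * np.mean(delays <= 1.03 * nominal)  # dies within a 3% timing margin
print(f"mean path delay {delays.mean():.2f} (nominal {nominal:.2f}), "
      f"sigma {delays.std():.2f}, timing yield {yield_pct:.1f}%")

# A compensation knob such as Adaptive Supply Voltage can be explored by
# re-evaluating the dies at a slightly higher supply voltage:
delays_asv = gate_delay(VDD_NOM + 0.05, VTH_NOM + vth_die + vth_rnd).sum(axis=1)
yield_asv = 100 * np.mean(delays_asv <= 1.03 * nominal)
print(f"timing yield with +50 mV ASV: {yield_asv:.1f}%")
```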
Abstract:
In the collective imagination, a robot is a human-like machine, like the androids of science fiction. However, the robots encountered most frequently are machines that do work that is too dangerous, boring or onerous for humans. Most of the robots in the world are of this type. They can be found in the automotive, medical, manufacturing and space industries. A robot is therefore a system that contains sensors, control systems, manipulators, power supplies and software, all working together to perform a task. The development and use of such systems is an active area of research, and one of the main problems is the development of interaction skills with the surrounding environment, which include the ability to grasp objects. To perform this task the robot needs to sense the environment and acquire information about the object, i.e. the physical attributes that may influence a grasp. Humans can solve this grasping problem easily thanks to their past experience, which is why many researchers approach it from a machine-learning perspective, finding grasps for an object using information about already known objects. But humans can select the best grasp among a vast repertoire, considering not only the physical attributes of the object to grasp but also the effect to be obtained. This is why, in our case, the study in the area of robot manipulation is focused on grasping and on integrating symbolic tasks with data gained through sensors. The learning model is based on a Bayesian network that encodes the statistical dependencies between the data collected by the sensors and the symbolic task. This data representation has several advantages: it makes it possible to take into account the uncertainty of the real world and thus to deal with sensor noise, it encodes notions of causality, and it provides a unified network for learning. Since the network is currently implemented by hand, based on human expert knowledge, it is very interesting to implement an automated method to learn its structure: in the future more tasks and object features may be introduced, and a complex network designed only on the basis of human expert knowledge can become unreliable. Since structure learning algorithms present some weaknesses, the goal of this thesis is to analyze the real data used in the network modeled by the human expert, to implement a feasible structure learning approach and to compare the results with the network designed by the expert, in order to possibly enhance it.
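A minimal illustration of score-based structure learning of the kind considered here is to score candidate parent sets of a node with the Bayesian Information Criterion on discretised data. The sketch below uses hypothetical variables ('size', 'task', 'grasp') and a brute-force search over parent sets; it is only a toy example, not the approach or data of the thesis.

```python
import numpy as np
import pandas as pd
from itertools import combinations

def bic_score(data: pd.DataFrame, node: str, parents: tuple) -> float:
    """BIC score of one node given a candidate parent set (discrete variables)."""
    n = len(data)
    if parents:
        counts = data.groupby(list(parents))[node].value_counts().unstack(fill_value=0)
    else:
        counts = data[node].value_counts().to_frame().T
    counts = counts.to_numpy(dtype=float)
    row_tot = counts.sum(axis=1, keepdims=True)
    # log-likelihood of the multinomial CPTs; zero counts contribute nothing
    ll = np.sum(counts * np.log(np.where(counts > 0, counts / row_tot, 1.0)))
    k = counts.shape[0] * (counts.shape[1] - 1)      # number of free parameters
    return ll - 0.5 * k * np.log(n)

# Hypothetical discretised sensor/task data, for illustration only
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "size": rng.integers(0, 3, 500),
    "task": rng.integers(0, 2, 500),
})
df["grasp"] = (df["size"] + df["task"] + rng.integers(0, 2, 500)) % 3

# Score every candidate parent set for 'grasp' and keep the best-scoring one
best = max(
    (ps for r in range(3) for ps in combinations(["size", "task"], r)),
    key=lambda ps: bic_score(df, "grasp", ps),
)
print("best parent set for 'grasp':", best)
```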
Abstract:
Synchronization is a key issue in any communication system, but it becomes fundamental in navigation systems, which are entirely based on the estimation of the time delay of the signals coming from the satellites. Thus, even though synchronization has been a well-known topic for many years, the introduction of new modulations and new physical-layer techniques in modern standards makes the traditional synchronization strategies completely ineffective. For this reason, the design of advanced and innovative techniques for synchronization in modern communication systems, such as DVB-SH, DVB-T2, DVB-RCS, WiMAX and LTE, and in modern navigation systems, such as Galileo, has been the topic of this research activity. Recent years have seen the consolidation of two different trends: the introduction of Orthogonal Frequency Division Multiplexing (OFDM) in communication systems, and of the Binary Offset Carrier (BOC) modulation in modern Global Navigation Satellite Systems (GNSS). Thus, particular attention has been given to the investigation of synchronization algorithms in these areas.
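At the core of both GNSS code acquisition and OFDM timing recovery lies the same basic operation: estimating the delay of a known sequence by locating a correlation peak. The sketch below illustrates this with a hypothetical ±1 pilot sequence, an FFT-based circular correlation and additive white Gaussian noise; it is a generic illustration, not one of the algorithms developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical parameters: a known +/-1 pilot sequence, an unknown integer
# delay, and additive white Gaussian noise at -10 dB SNR per sample.
N, TRUE_DELAY, SNR_DB = 1023, 137, -10
code = rng.choice([-1.0, 1.0], size=N)

rx = np.roll(code, TRUE_DELAY)                        # delayed replica
rx += rng.normal(0, 10 ** (-SNR_DB / 20), size=N)     # channel noise

# Circular cross-correlation via FFT; the peak location estimates the delay.
corr = np.fft.ifft(np.fft.fft(rx) * np.conj(np.fft.fft(code)))
est_delay = int(np.argmax(np.abs(corr)))
print(f"true delay {TRUE_DELAY}, estimated delay {est_delay}")
```

Even at a negative per-sample SNR, the correlation gain of the full sequence makes the peak stand out, which is why correlation-based delay estimation is the workhorse of both fields.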
Abstract:
This PhD thesis reports on car fluff management, recycling and recovery. Car fluff is the residual waste produced by car recycling operations, particularly from hulk shredding. Car fluff is also known as Automotive Shredder Residue (ASR); it is made of plastics, rubbers, textiles, metals and other materials, and it is very heterogeneous both in composition and in particle size. In fact, fines may amount to about 50%, making it difficult to sort out recyclable materials or to exploit the heat value of ASR through energy recovery. This three-year study started with the definition of the state of the art of End-of-Life Vehicle (ELV) recycling in Italy. A national recycling trial revealed the Italian recycling rate to be around 81% in 2008, while the European Community recycling target is set at 85% by 2015. Consequently, in accordance with the Industrial Ecology framework, a life cycle assessment (LCA) was conducted, revealing that sorting and recycling the polymers and metals contained in car fluff, followed by recovering the residual energy, is the route with the best environmental perspective. These results guided the second-year investigation, which involved pyrolysis trials on pretreated ASR fractions aimed at establishing which processes could be suitable for an industrial-scale ASR treatment plant. Sieving followed by flotation gave good results in the thermochemical conversion of polymers, with polyolefins showing excellent conversion rates. This factor triggered ecodesign considerations. Ecodesign, together with LCA, is one of the pillars of Industrial Ecology; it consists of design for recycling and design for disassembly, both aimed at improving the dismantling speed of car components and at the substitution of non-recyclable materials. Finally, during the last year, innovative plants and technologies for metal recovery from car fluff were visited and tested worldwide in order to design a new car fluff treatment plant aimed at ASR energy and material recovery.
Abstract:
Computer simulations play an ever-growing role in the development of automotive products. Assembly simulation, like many other processes, is used systematically even before the first physical prototype of a vehicle is built, in order to check whether particular components can be assembled easily or whether another part is in the way. Usually, this kind of simulation is limited to rigid bodies. However, a vehicle contains a multitude of flexible parts of various types: cables, hoses, carpets, seat surfaces, insulations, weatherstrips... Since most of the problems encountered in these simulations concern one-dimensional components, and since an intuitive tool for cable routing is still needed, we have chosen to concentrate on this category, which includes cables, hoses and wiring harnesses. In this thesis, we present a system for simulating one-dimensional flexible parts such as cables or hoses. The modeling of bending and torsion follows the Cosserat model. For this purpose we use a generalized spring-mass system and describe its configuration by a carefully chosen set of coordinates. Gravity and contact forces, as well as the forces responsible for length conservation, are expressed in Cartesian coordinates. But bending and torsion effects can be dealt with more effectively by using quaternions to represent the orientation of the segments joining two neighboring mass points. This augmented system allows an easy formulation of all interactions with the most appropriate coordinate type and yields a strongly banded Hessian matrix. An energy-minimization process yields a solution free of the oscillations that are typical of spring-mass systems. The use of integral forces, similar to an integral controller, allows the constraints to be enforced exactly. The whole system is numerically stable and can be solved at interactive frame rates. It is integrated in the DaimlerChrysler in-house Virtual Reality software veo for use in applications such as cable routing and assembly simulation, and has been well received by users. Parts of this work have been published at the ACM Solid and Physical Modeling Conference 2006 and have been selected for the special issue of the Computer-Aided Design Journal devoted to the conference.
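To give an idea of the energy-minimization viewpoint in a deliberately reduced form (no torsion, no quaternions, no contacts, hypothetical stiffnesses), the sketch below relaxes a chain of mass points with stretching and discrete bending energy under gravity by plain gradient descent. The system described in the thesis is considerably richer.

```python
import numpy as np

# A much-simplified cable: N mass points, clamped ends, stretching + discrete
# bending energy, gravity along -z. Stiffness values are hypothetical.
N, L0 = 20, 0.05                              # points, rest length per segment
K_STRETCH, K_BEND, MASS, G = 500.0, 0.02, 0.01, 9.81

x = np.zeros((N, 3))
x[:, 0] = np.linspace(0.0, (N - 1) * L0, N)   # start as a straight horizontal cable

def energy_grad(x):
    grad = np.zeros_like(x)
    # stretching energy: k/2 * (|d| - L0)^2 per segment
    d = x[1:] - x[:-1]
    dist = np.maximum(np.linalg.norm(d, axis=1, keepdims=True), 1e-9)
    f = K_STRETCH * (dist - L0) * d / dist
    grad[:-1] -= f
    grad[1:] += f
    # bending energy: k/2 * |x[i-1] - 2 x[i] + x[i+1]|^2 (discrete curvature)
    lap = x[:-2] - 2 * x[1:-1] + x[2:]
    grad[:-2] += K_BEND * lap
    grad[1:-1] -= 2 * K_BEND * lap
    grad[2:] += K_BEND * lap
    # gravitational potential m*g*z
    grad[:, 2] += MASS * G
    return grad

for _ in range(20000):            # plain gradient descent on the total energy
    g = energy_grad(x)
    g[0] = g[-1] = 0.0            # clamped end points do not move
    x -= 1e-4 * g
print("mid-point position after relaxation:", x[N // 2])
```

Because the equilibrium is found by minimizing an energy rather than by time-stepping a dynamical system, the typical spring-mass oscillations never appear, which is the property the thesis exploits at a much larger scale.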
Abstract:
This collection of essays examines various aspects of regional development and the issues of internationalization. The first essay investigates the implications of the impressive growth of China from a rural-urban perspective and addresses the topic of convergence in China by employing a non-parametric approach to study the distribution dynamics of per capita income at the province, rural and urban levels. To better understand the degree of inequality characterizing China and the long-term predictions of convergence or divergence of its different territorial aggregations, the second essay formulates a composite indicator of Regional Development (RDI) to benchmark development at the province and sub-province levels. The RDI goes beyond the uni-dimensional concept of development, generally proxied by GDP per capita, and gives attention to the rural-urban dimension. The third essay – "Internationalization and Trade Specialization in Italy. The role of China in the international intra-firm trade of the Italian regions" – deals with another aspect of regional economic development: the progressive de-industrialisation and de-localization of local production. This essay looks at the trade specialization of selected Italian regions (those specialized in manufacturing) and the fragmentation of local production on a global scale. In this context China represents an important stakeholder, and the paper documents the importance of this country in the regional intra-firm trade.
Abstract:
Life Cycle Assessment (LCA) is a chain-oriented tool to evaluate the environmental performance of products, focusing on the entire life cycle of these products: from the extraction of resources, via manufacturing and use, to the final processing of the disposed products. Through all these stages, consumption of resources and pollutant releases to air, water and soil are identified and quantified in the Life Cycle Inventory (LCI) analysis. The LCI phase is followed by the Life Cycle Impact Assessment (LCIA) phase, whose purpose is to convert resource consumption and pollutant releases into environmental impacts. The LCIA aims to model and evaluate environmental issues, called impact categories. Several reports emphasise the importance of LCA in the field of engineered nanomaterials (ENMs). ENMs offer enormous potential for the development of new products and applications. There are, however, unanswered questions about the impacts of ENMs on human health and the environment. In the last decade, the increasing production, use and consumption of nanoproducts, with the consequent release into the environment, has accentuated the obligation to ensure that potential risks are adequately understood, in order to protect both human health and the environment. Owing to its holistic and comprehensive assessment, LCA is an essential tool to evaluate, understand and manage the environmental and health effects of nanotechnology. However, the evaluation of the health and environmental impacts of nanotechnologies throughout their whole life cycle using the LCA methodology is still hampered by the lack of knowledge in relation to risk assessment. In fact, to date, the knowledge on human and environmental exposure to nanomaterials such as engineered nanoparticles (ENPs) is limited. This bottleneck is reflected in LCA, where characterisation models, and consequently characterisation factors, for ENPs are missing. The PhD project aims to assess the limitations and challenges of evaluating the freshwater aquatic ecotoxicity potential of ENPs, and in particular of nanoparticles such as n-TiO2, in the LCIA phase.
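The bottleneck described above sits in the characterisation step of LCIA, which for a given impact category c (for example, freshwater aquatic ecotoxicity) reduces to weighting each inventory emission by a substance-specific characterisation factor; without CF values for ENPs such as n-TiO2, their contribution to the category score simply cannot be computed. In the notation assumed here:

```latex
% LCIA characterisation step: impact score IS_c of category c from LCI emission
% masses m_i and substance-specific characterisation factors CF_{c,i}
\[
IS_c \;=\; \sum_i CF_{c,i} \cdot m_i
\]
```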
Abstract:
Nanotechnology entails the manufacturing and manipulation of matter at length scales ranging from single atoms to micron-sized objects. The ability to address properties on the biologically relevant nanometer scale has made nanotechnology attractive for nanomedicine. This is perceived as a great opportunity in healthcare, especially in diagnostics and therapeutics, and more generally for the development of personalized medicine. Nanomedicine has the potential to enable early detection and prevention, and to improve diagnosis, mass screening, treatment and follow-up of many diseases. From the biological standpoint, nanomaterials match the typical size of naturally occurring functional units or components of living organisms and, for this reason, enable more effective interaction with biological systems. Nanomaterials have the potential to influence functionality and cell fate in the regeneration of organs and tissues. To this aim, nanotechnology provides an arsenal of techniques for intervening in, fabricating, and modulating the environment where cells live and function. Unconventional micro- and nano-fabrication techniques allow the patterning of biomolecules and biocompatible materials down to feature sizes of a few nanometers. Patterning is not simply the deterministic placement of a material; in a broader sense it allows the controlled fabrication of structures and gradients of different nature. Gradients are emerging as one of the key factors guiding cell adhesion, proliferation, migration and even differentiation in the case of stem cells. The main goal of this thesis has been to devise a nanotechnology-based strategy and tools to spatially and temporally control biologically relevant phenomena in vitro which are important in some fields of medical research.
Abstract:
MFA and LCA methodologies were applied to analyse the anthropogenic aluminium cycle in Italy, with a focus on the historical evolution of stocks and flows of the metal, embodied GHG emissions, and the potential of recycling, in order to provide Italy with key elements for prioritizing industrial policy toward low-carbon technologies and materials. Historical time series were collected from 1947 to 2009 and balanced with data from the production, manufacturing and waste management of aluminium-containing products, using a 'top-down' approach to quantify the contemporary in-use stock of the metal and to identify 'applications where aluminium is not yet being recycled to its full potential and to identify present and future recycling flows'. The MFA results were used as the basis for the LCA, which was aimed at evaluating the evolution of the carbon footprint embodied in Italian aluminium, from primary and electrical energy, the smelting process and transportation. A discussion of how the main factors of the Kaya Identity equation influenced the Italian GHG emissions pattern over time, and of the levers available to mitigate it, is also reported. The contemporary anthropogenic reservoir of aluminium was estimated at about 320 kg per capita, mainly embedded within the transportation and the building and construction sectors. The cumulative in-use stock represents approximately 11 years of supply at current usage rates (about 20 Mt versus 1.7 Mt/year), and it would imply a potential of about 160 Mt of CO2eq emission savings. A discussion of the criticality related to aluminium waste recovery from the transportation and the containers and packaging sectors was also included in the study, providing an example of how MFA and LCA may support decision-making at the sectoral or regional level. The research constitutes the first attempt at an integrated MFA and LCA approach applied to the aluminium cycle in Italy.
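The Kaya Identity referred to above decomposes emissions F into population P, affluence (GDP per capita), the energy intensity of GDP and the carbon intensity of energy, which are the levers discussed in the study:

```latex
% Kaya Identity: emissions F decomposed into population P, affluence (GDP/P),
% energy intensity of GDP (E/GDP) and carbon intensity of energy (F/E)
\[
F \;=\; P \cdot \frac{GDP}{P} \cdot \frac{E}{GDP} \cdot \frac{F}{E}
\]
```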
Abstract:
One important metaphor, borrowed from biological theories, used to investigate organizational and business strategy issues is the metaphor of heredity; an area requiring further investigation is the extent to which the characteristics of the blueprints inherited from the parent help in explaining the subsequent development of the spawned ventures. In order to shed light on the tension between inherited patterns and the new trajectory that may characterize spawned ventures' development, we propose a model aimed at investigating which blueprint elements might exert an effect on business model design choices and to what extent their persistence (or abandonment) determines subsequent business model innovation. Under the assumption that academic and corporate institutions transmit different genes to their spin-offs, we therefore expect heterogeneity in the elements that affect business model design choices and their subsequent evolution. This is the reason why we carry out a twofold analysis in the biotech (meta)industry: under a multiple-case research design, the business model, and especially the fundamental design elements and themes that scholars have identified to decompose the construct, has been thoroughly analysed. Our purpose is to isolate the dimensions of the business model that may have been the object of legacy and those along which an experimentation and learning process is more likely to happen, bearing in mind that the differences between academic and corporate spin-offs might not be as evident as expected, especially considering that business model innovation may occur.
Abstract:
Natural hazards affecting industrial installations could directly or indirectly cause an accident or a series of accidents with serious consequences for the environment and for human health. Accidents initiated by a natural hazard or disaster which result in the release of hazardous materials are commonly referred to as Natech (Natural Hazard Triggering a Technological Disaster) accidents. The conditions brought about by these kinds of events are particularly problematic: the presence of the natural event increases the probability of exposure and causes consequences more serious than those of standard technological accidents. Despite a growing body of research and more stringent regulations for the design and operation of industrial activities, Natech accidents remain a threat. This is partly due to the absence of data and of dedicated risk-assessment methodologies and tools. Even the Seveso Directives for the control of risks due to major accident hazards do not include any specific requirements regarding the management of Natech risks in the process industries. Among the few available tools is the European Standard EN 62305, which addresses generic industrial sites, requiring the possibility of lightning to be taken into account and appropriate protection measures to be selected. Since it is intended for generic industrial installations, this standard sets the requirements for the design, construction and modification of structures, and is thus mainly oriented towards conventional civil buildings. A first purpose of this project is to study the effects and consequences of lightning on industrial sites, lightning being the most common adverse natural phenomenon in Europe and the cause of several industrial accidents initiated by natural causes. The industrial sector most susceptible to accidents triggered by lightning is the petrochemical one, due to the presence of atmospheric tanks (especially floating-roof tanks) containing flammable vapors which could easily be ignited by a lightning strike or by secondary lightning effects (such as electrostatic and electromagnetic pulses or ground currents). A second purpose of this work is to implement the procedure proposed by the European Standard on a specific kind of industrial plant, i.e. a chemical factory, in order to highlight the critical aspects of this implementation. A case-study plant handling flammable liquids was selected. The application of the European Standard made it possible to estimate the contribution of lightning activity to the total value of the default release frequency suggested by guidelines for atmospheric storage tanks. However, it became evident that the European Standard does not introduce any parameters explicitly accounting for the amount of dangerous substances which could be ignited or released. Furthermore, the parameters proposed to describe the characteristics of the structures potentially subjected to lightning strikes are insufficient to take into account the specific features of the different chemical equipment commonly present in chemical plants.
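For orientation, the strike-frequency part of such an assessment follows a collection-area approach: the expected number of direct strikes per year is the local ground flash density multiplied by an equivalent collection area around the structure. The sketch below is a simplified illustration with hypothetical tank dimensions; it omits the location and environment correction factors of EN 62305 and is not the case-study calculation of the thesis.

```python
import math

def collection_area(length_m: float, width_m: float, height_m: float) -> float:
    """Equivalent collection area of a free-standing rectangular structure:
    the plan area enlarged by a border of 3x the height, with rounded corners."""
    return (length_m * width_m
            + 2 * 3 * height_m * (length_m + width_m)
            + math.pi * (3 * height_m) ** 2)

def strikes_per_year(ng_per_km2_year: float, area_m2: float) -> float:
    """Expected direct strikes per year: N = Ng * Ad * 1e-6 (Ad in m^2)."""
    return ng_per_km2_year * area_m2 * 1e-6

# Hypothetical floating-roof tank treated as an equivalent rectangular structure
ad = collection_area(length_m=40.0, width_m=40.0, height_m=15.0)
n = strikes_per_year(ng_per_km2_year=2.5, area_m2=ad)
print(f"collection area ~{ad:.0f} m^2, expected direct strikes ~{n:.3f} per year")
```

The resulting frequency is then combined with the probability of damage and ignition to obtain a contribution to the release frequency, which is exactly where the standard's lack of parameters for the inventory of dangerous substances becomes a limitation.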