842 results for Technologies for prevention of accidents
Abstract:
This study examines the use of Cybercafé/Internet resources and evaluates their usefulness. Eight Cybercafés located in the university community were covered. Questionnaires, interviews with the Cybercafé owners, staff, and users, as well as personal observations made during inspections of these cafés, were used, and the data were analysed according to the background of the Internet users. Users judged the quality of the Internet by its richness, high speed, accuracy, and authority. Information such as the establishment of the cafés' facilities, membership, and the future of the Cybercafés was also examined. The survey shows that forty percent of the users learnt to use the Internet by self-instruction, while thirty-five percent learnt from colleagues or friends. Those in the sciences use the Internet the most, and the channel most used for obtaining information is the search engine. A large number of students, faculty, and researchers use the Internet to obtain information, and many of these users rely on the Cybercafés in the university community. Finally, one can clearly see that the dominating impact of digital technology has crossed the Rubicon of controversy.
Abstract:
Objective: The objective of this research was to study the influence of helmet use on facial trauma in victims of motorcycle accidents with moderate traumatic brain injury. Methods: We retrospectively reviewed the incidence of facial injuries in helmeted and non-helmeted victims with moderate traumatic brain injury at a referral trauma hospital. Results: The sample consisted of 272 patients, predominantly men (94.5%) and between 21 and 40 years old (62.9%). The majority of patients were wearing a helmet (80.1%). Facial fractures occurred most frequently in the zygomatic bone (51.8%), followed by the mandible (18.8%) and the nasal bones (9.2%). Conclusions: Individuals in the most productive age group are the most affected, which causes great losses to the financial and labor systems. It is important to take measures to alert the public to the severity of injuries likely to occur in motorcycle-related accidents and to ways of preventing them.
Abstract:
In 2008, academic researchers and public service officials created a university extension platform based on online and on-site meetings, denominated the "Work-Related Accidents Forum: Analysis, Prevention, and Other Relevant Aspects". Its aim was to help public agents and social partners propagate a systemic approach useful for the surveillance and prevention of work-related accidents. This article describes and analyses this platform. Online access is free and is structured to support the dissemination of updated concepts, support on-site meetings and capacity-building educational activities, and maintain a permanent space for debate among registered participants. The desired result is the propagation of a socio-technical, systemic view of work-related accidents that replaces the prevailing traditional view, which emphasizes human error and results in blaming the victims. The Forum adopts a problematizing pedagogy, in line with the approach known as permanent health education: it starts from the needs and experiences of workers and other social actors and stimulates mutual support and debate among participants. The current challenge is to turn the platform into a social networking website in order to broaden its links with society.
Abstract:
Background: The development of protocols for RNA extraction from paraffin-embedded samples facilitates gene expression studies on archival samples with known clinical outcome. Older samples are particularly valuable because they are associated with longer clinical follow-up. RNA extraction from formalin-fixed, paraffin-embedded (FFPE) tissue is problematic due to chemical modifications and continued degradation over time. We compared the quantity and quality of RNA extracted by four different protocols from 14 ten-year-old and 14 recently archived (three- to ten-month-old) FFPE breast cancer tissues. Using three spin-column purification-based protocols and one magnetic bead-based protocol, total RNA was extracted in triplicate, generating 336 RNA extraction experiments. RNA fragment size was assayed by reverse transcription-polymerase chain reaction (RT-PCR) for the housekeeping gene glucose-6-phosphate dehydrogenase (G6PD), testing primer sets designed to target RNA fragment sizes of 67 bp, 151 bp, and 242 bp. Results: Biologically useful RNA (minimum RNA integrity number, RIN, of 1.4) was extracted in at least one of three attempts of each protocol in 86–100% of the ten-year-old and 100% of the recently archived ("months-old") samples. Short RNA fragments up to 151 bp were assayable by RT-PCR for G6PD in all ten-year-old and months-old tissues tested, but none of the ten-year-old and only 43% of the months-old samples showed amplification when the targeted fragment was 242 bp. Conclusion: All protocols extracted RNA from ten-year-old FFPE samples with a minimum RIN of 1.4. Gene expression of G6PD could be measured in all samples, old and recent, using RT-PCR primers designed for RNA fragments up to 151 bp. RNA quality from ten-year-old FFPE samples was similar to that extracted from months-old samples, but quantity and success rate were generally higher for the months-old group. We preferred the magnetic bead-based protocol because of its speed and higher yield, although it produced RNA of similar quality to the other protocols. If a chosen protocol fails to extract biologically useful RNA from a given sample on a first attempt, another attempt and then another protocol should be tried before the case is excluded from molecular analysis.
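The retry strategy recommended in the conclusion can be written down as a simple decision loop. Below is a minimal, illustrative Python sketch of that workflow; the protocol names, the `extract_rna` callback, and the limit of three attempts per protocol are stand-ins for the wet-lab procedure, not part of the original study.

```python
# Minimal sketch of the retry workflow recommended above: try each
# protocol up to three times and accept the first extraction whose
# RNA integrity number (RIN) reaches the 1.4 threshold.
# `extract_rna` is a hypothetical stand-in for a wet-lab extraction.

from typing import Callable, Optional

MIN_RIN = 1.4               # "biologically useful" threshold from the study
ATTEMPTS_PER_PROTOCOL = 3   # the study extracted in triplicate

def usable_rna(sample_id: str,
               protocols: list[str],
               extract_rna: Callable[[str, str], Optional[float]]) -> Optional[str]:
    """Return the first protocol yielding RIN >= 1.4 for this sample,
    or None if the case should be excluded from molecular analysis."""
    for protocol in protocols:
        for _attempt in range(ATTEMPTS_PER_PROTOCOL):
            rin = extract_rna(sample_id, protocol)
            if rin is not None and rin >= MIN_RIN:
                return protocol
    return None

# Example with a stub extraction that always fails:
protocols = ["spin column A", "spin column B", "spin column C", "magnetic bead"]
print(usable_rna("case-01", protocols, lambda sample, proto: None))  # -> None
```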
Abstract:
This PhD thesis set out to validate and then apply innovative analytical methodologies for the determination of compounds with a harmful impact on human health, such as biogenic amines and ochratoxin A, in wines. The influence of production technology (pH, amino acid precursors, and the use of different malolactic starters) on the biogenic amine content of wines was therefore evaluated. An HPLC method for the simultaneous determination of amino acids and amines, with pre-column derivatization with 9-fluorenylmethoxycarbonyl chloride (FMOC-Cl) and UV detection, was developed. Initially, the influence of pH, derivatization time, and gradient profile was studied. In order to improve the separation of amino acids and amines and reduce the analysis time, the influence of different flow rates and of different columns on the chromatographic method was then studied: first a C18 Luna column was used, and later two monolithic Chromolith columns in series. The method proved suitable for the easy, precise, and accurate determination of a relatively large number of amino acids and amines in wines. It was then applied to different wines produced in the Emilia-Romagna region. The investigation made it possible to discriminate between red and white wines. The amino acid content is related to the winemaking process, and the biogenic amine content of these wines does not represent a toxicological problem for human health. The study of the influence of technologies and wine composition demonstrated that the pH of the wine and its amino acid content are the most important factors; in particular, wines with pH > 3.5 show higher concentrations of biogenic amines than wines with lower pH. Enrichment of the wines with nutrients also influences the content of some biogenic amines, which are higher in wines supplemented with amino acid precursors. In this study, amino acids and biogenic amines were not statistically affected by the strain of lactic acid bacteria inoculated as a starter for malolactic fermentation. An evaluation of different clean-up methods (SPE-MycoSep, IACs, and LLE) and determination methods (HPLC and ELISA) for ochratoxin A was also carried out. The results showed that the SPE clean-up methods are equally reliable, while the LLE procedure shows the lowest recovery, and that the ELISA method gives lower determinations and lower reproducibility than the HPLC method.
Abstract:
Food technology today means reducing agricultural food waste, improving food security, enhancing the sensory properties of food, and enlarging food markets and food economies. Food technologists must be highly skilled technicians with good scientific knowledge of food hygiene, food chemistry, industrial technologies, and food engineering, experience in sensory evaluation, and a command of analytical chemistry. Their role is to apply the modern vision of science to the field of human nutrition, raising the level of knowledge in food science. The present PhD project started with the aim of studying and improving the quality of frozen fruits. The freezing process is very powerful in preserving the characteristics of the initial raw material, but pre-treatments before freezing are necessary to improve quality, in particular the texture and enzymatic activity of frozen foods. Osmotic dehydration (OD) and vacuum impregnation (VI) are useful techniques to modify the composition of fruits and vegetables and prepare them for the freezing process. These techniques make it possible to introduce cryo-protective agents into the food matrix without significant changes to the original structure, although they cause a slight leaching of important intrinsic compounds. Phenolic and polyphenolic compounds in apples and nectarines treated with hypertonic solutions, for example, are slightly decreased, but the concentration effect of the water removal driven by the osmotic gradient results in a final phenolic content similar to that of the raw material. In many experiments, a very important change in fruit composition concerned the aroma profile. This occurred in strawberries osmo-dehydrated under vacuum or at atmospheric pressure. The increase in some volatiles, probably due to fermentative metabolism induced by the osmotic stress of the hypertonic treatment, modified the sensory profile of the frozen fruits in a way that improved consumer acceptability: consumers preferred treated frozen fruits to untreated frozen fruits. Among the different processes tested, a very interesting result was obtained with an osmotic pre-treatment carried out at refrigerated temperature for a long time. The final quality of the frozen strawberries was very high, and a peculiar increase in the phenolic profile was detected. This interesting phenomenon was probably due to the induction of phenolic biosynthesis (for example, as a reaction to osmotic stress) or to the hydrolysis of polymeric phenolic compounds. Alongside this investigation of the cryo-stabilization and dehydrofreezing of fruits, deeper investigations of VI techniques were carried out, such as studies of texture changes in vacuum-impregnated prickly pear and of the use of VI and ultrasound (US) in the aroma enrichment of fruit pieces. Moreover, to develop sensory evaluation tools and analytical determinations (of volatiles and phenolic compounds), research was carried out and published in these fields, specifically on off-flavour development during storage of boiled potatoes and on the determination of phenolic compounds by capillary zone electrophoresis (CZE) and high-performance liquid chromatography (HPLC).
Abstract:
Consumer demand for natural, minimally processed, fresh-like, and functional food has led to an increasing interest in emerging technologies. The aim of this PhD project was to study three innovative food processing technologies currently used in the food sector. Ultrasound-assisted freezing, vacuum impregnation, and pulsed electric fields were investigated through laboratory-scale systems and semi-industrial pilot plants. Furthermore, analytical and sensory techniques were developed to evaluate the quality of food and vegetable matrices obtained by traditional and emerging processes. Ultrasound was found to be a valuable technique for improving the freezing process of potatoes, anticipating the onset of nucleation, mainly when applied during the supercooling phase. A study of the effects of pulsed electric fields on the phenol and enzymatic profile of melon juice was carried out, and the statistical treatment of the data was performed with a response surface method. Next, flavour enrichment of apple sticks was carried out using different techniques: atmospheric, vacuum, and ultrasound technologies and their combinations. The second section of the thesis deals with the development of analytical methods for the discrimination and quantification of phenol compounds in vegetable matrices such as chestnut bark extracts and olive mill waste water. The management of waste disposal in the milling sector was approached with the aim of reducing the amount of waste while recovering valuable by-products for use in different industrial sectors. Finally, the sensory analysis of boiled potatoes was carried out through the development of a quantitative descriptive procedure for the study of Italian and Mexican potato varieties. An update on flavour development in fresh and cooked potatoes was produced, and a sensory glossary, including general and specific definitions related to organic products, used in the European project Ecropolis, was drafted.
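As a rough illustration of the response-surface treatment mentioned above, the sketch below fits a full second-order model to synthetic two-factor data. The factors (field strength, pulse number), their ranges, and the simulated response are hypothetical; only the fitting technique is illustrated, not the thesis's actual design or results.

```python
# Minimal sketch of a second-order response surface fit, as commonly used
# for the statistical treatment of pulsed-electric-field data. The two
# factors (field strength E, pulse number n) and the synthetic response
# are hypothetical; only the fitting technique is shown.

import numpy as np

rng = np.random.default_rng(0)
E = rng.uniform(1.0, 5.0, 30)   # field strength, kV/cm (assumed range)
n = rng.uniform(10, 100, 30)    # number of pulses (assumed range)
y = 2 + 0.8*E + 0.03*n - 0.1*E**2 - 0.0002*n**2 + 0.004*E*n \
    + rng.normal(0, 0.05, 30)   # synthetic response, e.g. phenol retention

# Design matrix for the full quadratic model:
# y = b0 + b1*E + b2*n + b3*E^2 + b4*n^2 + b5*E*n
X = np.column_stack([np.ones_like(E), E, n, E**2, n**2, E*n])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", np.round(coef, 4))
```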
Abstract:
Proper hazard identification has become progressively more difficult to achieve, as witnessed by several major accidents in Europe, such as the ammonium nitrate explosion at Toulouse (2001) and the vapour cloud explosion at Buncefield (2005), whose accident scenarios were not considered in the site safety cases. Furthermore, the rapid renewal of industrial technology has brought about the need to upgrade hazard identification methodologies. Accident scenarios of emerging technologies, which are not yet properly identified, may remain unidentified until they take place for the first time. The consideration of atypical scenarios, which deviate from normal expectations of unwanted events or from worst-case reference scenarios, is thus extremely challenging. A specific method, named the Dynamic Procedure for Atypical Scenarios Identification (DyPASI), was developed as a complementary tool to bow-tie identification techniques. The main aim of the methodology is to provide an easier but comprehensive hazard identification of the industrial process analysed by systematizing information from early signals of risk related to past events, near misses, and inherent studies. DyPASI was validated on two examples of new and emerging technologies: Liquefied Natural Gas regasification and Carbon Capture and Storage. The study broadened the knowledge of the related emerging risks and, at the same time, demonstrated that DyPASI is a valuable tool for obtaining a complete and updated overview of potential hazards. Moreover, in order to tackle the underlying accident causes of atypical events, three methods for the development of early warning indicators were assessed: the Resilience-based Early Warning Indicator (REWI) method, the Dual Assurance method, and the Emerging Risk Key Performance Indicator method. REWI was found to be the most complementary and effective of the three, demonstrating that its synergy with DyPASI would be an adequate strategy for improving hazard identification methodologies toward the capture of atypical accident scenarios.
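As a loose illustration of the bow-tie structure that DyPASI complements, the sketch below encodes a bow-tie as a simple record and appends a cause/consequence pair suggested by an early signal of risk. The `BowTie` class, the method name, and the event names are hypothetical; the published procedure is considerably richer than this.

```python
# Illustrative sketch of a bow-tie record being extended with an
# atypical scenario, in the spirit of DyPASI. All names here are
# hypothetical; this is not the published procedure.

from dataclasses import dataclass, field

@dataclass
class BowTie:
    top_event: str
    threats: list[str] = field(default_factory=list)        # left side: causes
    consequences: list[str] = field(default_factory=list)   # right side: outcomes

    def add_atypical_scenario(self, threat: str, consequence: str, source: str) -> None:
        """Integrate an early signal of risk (past event, near miss, study)
        by adding the cause/outcome pair it reveals."""
        if threat not in self.threats:
            self.threats.append(threat)
        if consequence not in self.consequences:
            self.consequences.append(consequence)
        print(f"bow-tie updated from early signal: {source}")

bt = BowTie(top_event="loss of containment",
            threats=["corrosion", "overpressure"],
            consequences=["pool fire", "toxic dispersion"])
bt.add_atypical_scenario("rollover in LNG storage tank",
                         "large vapour release",
                         source="near-miss report (hypothetical)")
```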
Abstract:
Throughout the twentieth century, statistical methods increasingly became part of experimental research. In particular, statistics made quantification processes meaningful in the soft sciences, which had traditionally relied on activities such as collecting and describing diversity rather than taming variation. The thesis explores this change in relation to agriculture and biology, focusing on analysis of variance and experimental design, the statistical methods developed by the mathematician and geneticist Ronald Aylmer Fisher during the 1920s. The role that Fisher's methods acquired as tools of scientific research, side by side with the laboratory equipment and field practices adopted by research workers, is investigated bottom-up, beginning with the computing instruments and information technologies that were the tools of the trade for statisticians. Four case studies show from several perspectives the interaction of statistics, computing, and information technologies, giving on the one hand an overview of the main tools adopted in the period (mechanical calculators, statistical tables, punched and index cards, standardised forms, digital computers), and on the other pointing out how these tools complemented each other and were instrumental in the development and dissemination of analysis of variance and experimental design. The period considered is the half-century from the early 1920s to the late 1960s, the institutions investigated are Rothamsted Experimental Station and the Galton Laboratory, and the statisticians examined are Ronald Fisher and Frank Yates.
Abstract:
The meaning of a place has commonly been assigned to the quality of rootedness or a sense of belonging to that setting. Nowadays, on the contrary, people are more concerned with the possibilities of free movement and networks of communication. The meaning, as well as the materiality, of architecture has thus been dramatically altered by these forces. It is therefore significant to explore and redefine the sense and direction of architecture in the age of flow. In this dissertation we initially review the gradually changing concept of "place-non-place" and its underlying technological basis. We then portray the transformation of the meaning of architecture at the dawn of the 21st century, as influenced by media, information technology, and advanced methods of mobility. Against such a backdrop, there is a need to sort and analyze architectural practices in response to the triplet of place, non-place, and the space of flows, which we plan to achieve conclusively. We also trace the concept of flow in the processes of formation and transformation of old cities. As a case study, we look at the Persian Bazaar from a socio-architectural point of view. In other words, based on Robert Putnam's theory of social capital, we link the social context of the Bazaar with the architectural configuration of cities. That is how we argue that "cities as flow" are not necessarily a new paradigm.
Abstract:
Waste management is an important issue in our society, and Waste-to-Energy incineration plants have played a significant role in recent decades, with increasing importance in Europe. One of the main issues posed by waste combustion is the generation of air contaminants. Of particular concern are acid gases, mainly hydrogen chloride and sulfur oxides, due to their potential impact on the environment and on human health. Therefore, the present study analyzed the main available technological options for flue gas treatment, focusing on dry treatment systems, which are increasingly applied in Municipal Solid Waste (MSW) incinerators. An operational model was proposed to describe and optimize the acid gas removal process. It was applied to an existing MSW incineration plant, where acid gases are neutralized in a two-stage dry treatment system based on the injection of powdered calcium hydroxide and sodium bicarbonate in reactors followed by fabric filters. HCl and SO2 conversions were expressed as functions of the reactant flow rates, with model parameters calculated from literature and plant data. Implementation in process simulation software allowed the identification of optimal operating conditions, taking into account the reactant feed rates, the amount of solid products, and the recycling of the sorbent. Alternative configurations of the reference plant were also assessed. The applicability of the operational model was extended by also developing a fundamental approach to the issue: a predictive model describing the mass transfer and kinetic phenomena governing acid gas neutralization with solid sorbents. The rate-controlling steps were identified through the reproduction of literature data, allowing the description of acid gas removal in the case study analyzed. A laboratory device was also designed and commissioned to determine the required model parameters.
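To make the operational idea concrete, here is a minimal numerical sketch of acid gas conversion expressed as a function of the sorbent feed. The saturation-type law, its parameters, and the function names are assumptions chosen for illustration; the thesis's actual model and parameter values are not reproduced here.

```python
# Illustrative sketch: acid gas conversion as a function of reactant feed,
# using an assumed saturation-type law X = X_max * (1 - exp(-k * SR)),
# where SR is the stoichiometric ratio of sorbent to acid gas.
# Form and parameters are hypothetical, not the thesis's model.

import math

def conversion(sr: float, x_max: float = 0.99, k: float = 1.5) -> float:
    """Fraction of acid gas (e.g. HCl) removed at stoichiometric ratio `sr`."""
    return x_max * (1.0 - math.exp(-k * sr))

def sorbent_feed(acid_flow_kmol_h: float, sr: float, stoich: float = 2.0) -> float:
    """Sorbent feed (kmol/h) for a given acid gas flow and stoichiometric ratio.
    `stoich` is the moles of acid neutralized per mole of sorbent,
    e.g. Ca(OH)2 + 2 HCl -> CaCl2 + 2 H2O gives stoich = 2."""
    return sr * acid_flow_kmol_h / stoich

for sr in (0.5, 1.0, 1.5, 2.0):
    print(f"SR = {sr:.1f}: HCl conversion ≈ {conversion(sr):.2%}")
```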
Abstract:
Multifunctional Structures (MFS) represent one of the most promising disruptive technologies in the space industry. The possibility of merging spacecraft primary and secondary structures as well as attitude control, power management, and onboard computing functions is expected to allow savings in mass, volume, and integration effort. Additionally, this will bring the modular construction of spacecraft to a whole new level by making the development and integration of spacecraft modules, or building blocks, leaner, reducing lead times from commissioning to launch from the current 3-6 years down to the order of 10 months, as foreseen by the latest Operationally Responsive Space (ORS) initiatives. Several basic functionalities have been integrated and tested in specimens of various natures over the last two decades; however, a more integrated, system-level approach was yet to be developed. The activity reported in this thesis focused on the system-level approach to multifunctional structures for spacecraft, namely in the context of nano- and micro-satellites. The thesis documents the work undertaken in the context of the MFS program promoted by the European Space Agency under the Technology Readiness Program (TRP): a feasibility study, including specimen manufacturing and testing. The work sequence covered a state-of-the-art review, with particular attention to the traditional modular architectures implemented in the ALMASat-1 and ALMASat-EO satellites, and requirements definition, followed by the development of a modular multi-purpose nano-spacecraft concept, and finally by the design, integration, and testing of integrated MFS specimens. The approach for the integration of several critical functionalities into nano-spacecraft modules was validated, and the overall performance of the system was verified through relevant functional and environmental testing at the University of Bologna and University of Southampton laboratories.
Abstract:
So-called cascading events, which lead to high-impact, low-frequency scenarios, are raising concern worldwide. In such events a chain of occurrences results in a major industrial accident with dreadful (and often unpredicted) consequences. Cascading events can be the result of the realization of an external threat, like a terrorist attack or a natural disaster, or of the "domino effect". During domino events, the escalation of a primary accident is driven by the propagation of the primary event to nearby units, causing an overall increase in the severity of the accident and in the risk associated with the industrial installation. Natural disasters, such as intense flooding, hurricanes, earthquakes, and lightning, are also capable of raising the risk of an industrial area, triggering losses of containment of hazardous materials and major accidents. The scientific community usually refers to these accidents as "NaTech" events: natural events triggering industrial accidents. This document describes the state of the art of available approaches to the modelling, assessment, prevention, and management of domino and NaTech events. However, the relevant work carried out in past studies still needs to be consolidated and completed in order to be applicable in a real industrial framework. New methodologies, developed during my research activity and aimed at the quantitative assessment of domino and NaTech accidents, are presented. The tools and methods provided in this study are intended to assist progress toward a consolidated and universal methodology for the assessment and prevention of cascading events, contributing to enhancing the safety and sustainability of the chemical and process industry.
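A common way to quantify a domino scenario, consistent with the escalation logic described above, combines the frequency of the primary event with a conditional escalation probability. The sketch below is a generic illustration with hypothetical numbers; it is not the specific methodology developed during this research.

```python
# Generic sketch of domino-scenario quantification:
# frequency of the secondary (escalated) scenario =
#   frequency of the primary event * P(escalation | primary).
# All numbers are hypothetical.

primary_frequency = 1e-4      # primary loss of containment, events/year
p_escalation = {              # conditional escalation probabilities to nearby units
    "tank_B (heat radiation)": 0.15,
    "pipe_rack (overpressure)": 0.05,
}

for target, p in p_escalation.items():
    domino_frequency = primary_frequency * p
    print(f"{target}: domino scenario frequency ≈ {domino_frequency:.1e} /year")

# Overall frequency of at least one escalation, assuming independent targets:
p_none = 1.0
for p in p_escalation.values():
    p_none *= (1.0 - p)
print(f"any escalation ≈ {primary_frequency * (1 - p_none):.1e} /year")
```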
Abstract:
This study concerns teachers' use of digital technologies in student assessment and how the learning developed through the use of technology in mathematics can be evaluated. Nowadays math teachers use digital technologies in their teaching, but not in student assessment. Activities carried out with technology are seen as 'extra-curricular' (by both teachers and students), and thus students do not learn what they can do in mathematics with digital technologies. I was interested in knowing why teachers do not use digital technology to assess students' competencies, and what they would need in order to design innovative and appropriate tasks to assess students' learning through digital technology. This dissertation is built on two main components: teachers and task design. I analyze teachers' practices involving digital technologies using Ruthven's Structuring Features of Classroom Practice, and how these practices relate to the types of assessment they use. I study the kinds of assessment tasks teachers design with a DGE (Dynamic Geometry Environment), using Laborde's categorization of DGE tasks. I consider the competencies teachers aim to assess with these tasks and how their goals relate to the learning outcomes of the curriculum. This study also develops new directions in designing suitable tasks for mathematical assessment in a DGE, driven by the desire to know what kinds of questions teachers might be most interested in using. I investigate the kinds of technology-based assessment tasks teachers value and the types of feedback they give to students. Finally, I point out that the curriculum should include a range of mathematical and technological competencies involving the use of digital technologies in mathematics, and I evaluate the possibility of taking advantage of technology feedback to allow students to continue learning while they are taking a test.