2 results for measurement of time interval
in CORA - Cork Open Research Archive - University College Cork - Ireland
Abstract:
The principal objective of this thesis was to investigate the ability of reversible optical O2 sensors to be incorporated into food/beverage packaging systems to continuously monitor O2 levels in a non-destructive manner immediately post-packaging and over time. Residual levels of O2 present in packs can negatively affect product quality and subsequently product shelf-life, especially for O2-sensitive foods/beverages. Therefore, the ability of O2 sensors to continuously monitor O2 levels within food/beverage packages was considered commercially relevant in terms of identifying the consequences of residual O2 on product safety and quality over time. Research commenced with the development of a novel range of O2 sensors based on phosphorescent platinum and palladium octaethylporphyrin-ketones (OEPk) in nano-porous high-density polyethylene (HDPE), polypropylene (PP) and polytetrafluoroethylene (PTFE) polymer supports. Sensors were calibrated over a temperature range of -10°C to +40°C and deemed suitable for food and beverage packaging applications. This sensor technology proved effective in detecting failures in packaging containment, as was clearly demonstrated in the packaging of cheese string products. The sensor technology was also assessed across a wide range of packaged products: beer, ready-to-eat salad products, bread and convenience-style, muscle-based processed food products. The O2 sensor technology performed extremely well within all packaging systems. It adequately detected O2 levels in beer bottles prior to and following pasteurisation, in modified atmosphere (MA) packs of ready-to-eat salads as respiration progressed during product storage, and in MA packs of bread and convenience-style muscle-based products as mycological growth occurred over time in the presence and absence of ethanol emitters. The use of the technology, in conjunction with standard food quality assessment techniques, proved remarkably useful in determining the impact of actual levels of O2 on specific quality attributes. The O2 sensing probe was modified, miniaturised and automated to screen for total aerobic viable counts (TVC) in samples of several fish species. The test showed good correlation with the conventional TVC test (ISO 4833:2003), as well as good analytical performance and ruggedness with respect to variation of key assay parameters (probe concentration and pipetting volume). Overall, the respirometric fish TVC test was simple to use, possessed a dynamic microbial range (10⁴–10⁷ cfu/g sample), had an accuracy of ±1 log(cfu/g sample) and was rapid. Its ability to assess highly perishable products such as fish for total microbial growth in under 12 h demonstrates commercial potential.
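For illustration only: phosphorescence-based O2 sensors of this kind are commonly read out through Stern-Volmer quenching, in which the O2 level is inferred from how strongly the excited-state lifetime of the dye is shortened by oxygen. The minimal Python sketch below shows that conversion; the function name and the lifetime/quenching constants are hypothetical placeholders, not the calibration values reported in the thesis.

# Illustrative Stern-Volmer conversion of phosphorescence lifetime to O2
# partial pressure; the constants are hypothetical placeholders, not the
# thesis's reported calibration.

def o2_from_lifetime(tau_us, tau0_us=60.0, ksv_per_kpa=0.05):
    """Estimate O2 partial pressure (kPa) from a measured phosphorescence
    lifetime using the simple Stern-Volmer relation:
        tau0 / tau = 1 + Ksv * pO2
    tau_us      -- measured lifetime in microseconds
    tau0_us     -- unquenched (zero-O2) lifetime in microseconds (placeholder)
    ksv_per_kpa -- Stern-Volmer quenching constant per kPa (placeholder)
    """
    if tau_us <= 0 or tau_us > tau0_us:
        raise ValueError("lifetime must be positive and not exceed tau0")
    return (tau0_us / tau_us - 1.0) / ksv_per_kpa


if __name__ == "__main__":
    # A lifetime close to tau0 indicates a near-anoxic headspace (good seal);
    # a strongly quenched (short) lifetime indicates O2 ingress or a leak.
    for tau in (58.0, 40.0, 20.0):
        print(f"tau = {tau:5.1f} us  ->  pO2 ~ {o2_from_lifetime(tau):5.2f} kPa")

In a packaging context, repeated non-destructive readings of this kind are what allow residual or ingressing O2 to be tracked over the product's storage life.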
Abstract:
It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2⁷⁰ bytes), and this figure is expected to grow by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question is therefore asked: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner? The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating any hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
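For illustration only: a workflow built around the production, interpretation and consumption of data, with a maintainable provenance record, could be sketched roughly as below. This Python example is a hypothetical, simplified stand-in for the platform described in the abstract; the function names, step names and digest scheme are assumptions made for the sketch.

# Minimal sketch of a produce/interpret/consume workflow with a provenance
# record, assuming a simple in-memory pipeline; names and structure are
# illustrative, not the actual platform described in the abstract.
import hashlib
import json
import time


def _digest(obj):
    # Stable hash of a JSON-serialisable object, used to link provenance entries.
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()[:12]


def apply_step(data, step_name, func, provenance):
    # Run one analysis step on `data` and append a verifiable provenance entry.
    result = func(data)
    provenance.append({
        "step": step_name,
        "timestamp": time.time(),
        "input_digest": _digest(data),
        "output_digest": _digest(result),
    })
    return result


if __name__ == "__main__":
    provenance = []
    raw = [0.1, 0.4, 0.35, 0.9, 0.2]                 # stand-in for a raw data stream
    scaled = apply_step(raw, "normalise",
                        lambda xs: [x / max(xs) for x in xs], provenance)
    flagged = apply_step(scaled, "threshold",
                         lambda xs: [x > 0.5 for x in xs], provenance)
    # The provenance log lets an independent analyst re-run each step and
    # check that the digests match before trusting a derived conclusion.
    print(json.dumps(provenance, indent=2))

Because each entry records the digests of its inputs and outputs, different analysis techniques can be applied to the same raw data while a third party can still verify how any conclusion was derived.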