7 results for travel time variability, value of time, commuting, stated preference
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The surface electrocardiogram (ECG) is an established diagnostic tool for the detection of abnormalities in the electrical activity of the heart. The interest in the ECG, however, extends beyond diagnostic purposes. In recent years, studies in cognitive psychophysiology have related heart rate variability (HRV) to memory performance and mental workload. The aim of this thesis was to analyze the variability of rhythms derived from the surface ECG at two different time scales: the discrete-event time scale, typical of beat-related features (Objective I), and the “continuous” time scale of separated sources in the ECG (Objective II), in selected scenarios relevant to psychophysiological and clinical research, respectively. Objective I) Joint time-frequency and non-linear analysis of HRV was carried out with the goal of assessing psychophysiological workload (PPW) in response to tasks engaging working memory. Results from fourteen healthy young subjects suggest the potential use of the proposed indices in discriminating PPW levels in response to varying memory-search task difficulty. Objective II) A novel source-cancellation method based on morphology clustering was proposed for the estimation of the atrial wavefront in atrial fibrillation (AF) from body surface potential maps. A strong direct correlation between the spectral concentration (SC) of the atrial wavefront and the temporal variability of the spectral distribution was shown in persistent AF patients, suggesting that with higher SC a shorter observation time is required to collect the spectral distribution from which the fibrillatory rate is estimated. This could be time- and cost-effective in clinical decision-making. The results held for reduced lead sets, suggesting that a simplified setup could also be considered, further reducing costs. In designing the methods of this thesis, an online signal-processing approach was maintained, with the goal of contributing to real-world applicability. An algorithm for automatic assessment of ambulatory ECG quality and an automatic ECG delineation algorithm were designed and validated.
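As a purely illustrative aside (not part of the abstract above), a frequency-domain HRV index of the kind this work builds on can be sketched in a few lines of Python. The 4 Hz resampling rate, the LF/HF band limits and the function name hrv_frequency_indices are assumptions chosen for illustration; this is not the joint time-frequency and non-linear pipeline developed in the thesis.

import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

def hrv_frequency_indices(rr_ms, fs_resample=4.0):
    """Illustrative LF/HF analysis of an RR-interval series given in ms.
    A generic frequency-domain HRV sketch, not the thesis pipeline."""
    rr_s = np.asarray(rr_ms) / 1000.0          # RR intervals in seconds
    t = np.cumsum(rr_s)                        # beat occurrence times
    # Resample the unevenly sampled tachogram on a uniform time grid
    t_uniform = np.arange(t[0], t[-1], 1.0 / fs_resample)
    rr_uniform = interp1d(t, rr_s, kind='cubic')(t_uniform)
    # Welch periodogram of the detrended tachogram
    f, pxx = welch(rr_uniform - rr_uniform.mean(), fs=fs_resample, nperseg=256)
    lf_band = (f >= 0.04) & (f < 0.15)         # low-frequency band
    hf_band = (f >= 0.15) & (f < 0.40)         # high-frequency band
    lf = np.trapz(pxx[lf_band], f[lf_band])
    hf = np.trapz(pxx[hf_band], f[hf_band])
    return {"LF": lf, "HF": hf, "LF/HF": lf / hf}

Spectral measures such as the LF/HF ratio are the conventional starting point for workload studies; the indices proposed in the thesis go beyond this simple stationary estimate by adding time-frequency and non-linear descriptors.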
Abstract:
Cured meats and dairy products are criticized for their salt content and synthetic additives. This has led to the development of strategies to reduce and replace these ingredients. Since the food matrix and technological processes can affect the bioaccessibility of nutrients, it is necessary to study their release during digestion to determine the real nutritional value of foods. In the first part of this PhD project, the impact on nutritional quality of reducing the sodium content and of replacing synthetic nitrates/nitrites with a combination of innovative formulations was evaluated in Parmigiano Reggiano cheese and salami. For this purpose, an in vitro digestion model combined with different analytical techniques was used. The results showed that fatty acid and protein release increased over time during digestion. At the end of digestion, the innovative formulation/processing did not negatively affect fatty acid release and protein hydrolysis, and it led to the formation of bioactive peptides. The excessive intake of sugars is correlated with metabolic diseases. After intestinal uptake, their release into the bloodstream depends on their metabolic fate within the enterocyte. In the second part of this PhD project, the absorption and metabolism of glucose, fructose and sucrose were evaluated using an intestinal cell line. A faster absorption of fructose than of glucose was observed, and a different modulation of the synthesis/transport of other metabolites by monosaccharides was shown. Intestinal cells were also used to verify the stability and intestinal uptake of vitamins (A and D3) delivered to cells through two vehicles. It was shown that the presence of lipids protected the vitamins from external factors such as light, heat and oxygen, and improved their bioavailability. Overall, the results obtained in this PhD project confirmed that considering only the chemical composition of foods is not sufficient to determine their nutritional value.
Abstract:
Traceability is often perceived by food industry executives as an additional cost of doing business, one to be avoided if possible. However, a traceability system can in fact ensure compliance with regulatory requirements, increase food safety and recall performance, improve marketing performance, and improve supply chain management. Thus, traceability affects firms' business performance in terms of the costs and benefits determined by traceability practices. These costs and benefits are in turn influenced by factors such as firms' characteristics, the level of traceability and, lastly, the costs and benefits perceived prior to traceability implementation. This thesis was undertaken to understand how these factors are linked to affect the outcome of costs and benefits. Analysis of the results of a plant-level survey of the Italian ichthyic processing industry revealed that processors generally adopt various levels of traceability, while government support appears to increase the level of traceability and both the expected and actual costs and benefits. None of the firms' characteristics, with the exception of government support, influences costs and the level of traceability. Only firm size and the level of QMS certification are linked with benefits, while the precision of traceability increases benefits without affecting costs. Finally, traceability practices appear to be driven by requests from “external” stakeholders such as government, authorities and customers rather than by “internal” factors (e.g. improving the firm's management), while the traceability system does not provide any added value from the market in terms of price premium or market share increase.
Abstract:
On May 25, 2018, the EU introduced the General Data Protection Regulation (GDPR), which offers EU citizens a shelter for their personal information by requiring companies to explain clearly how people's information is used. To comply with the new law, European and non-European companies interacting with EU citizens undertook a massive data re-permission-request campaign. However, while on the one hand the EU Regulator was particularly specific in defining the conditions for obtaining access to customers' data, on the other hand it did not specify how the communication between firms and consumers should be designed. This left firms free to design their re-permission emails as they liked, plausibly coupling the informative nature of these privacy-related communications with other persuasive techniques to maximize data disclosure. Consequently, we took advantage of this colossal wave of simultaneous requests to provide insights into two issues. Firstly, we investigate how companies across industries and countries chose to frame their requests. Secondly, we investigate which factors influenced the selection of alternative re-permission formats. In order to achieve these goals, we examine the content of a sample of 1506 re-permission emails sent by 1396 firms worldwide, and we identify the dominant “themes” characterizing these emails. We then relate these themes to both the expected benefits firms may derive from data usage and the possible risks they may face from not being fully compliant with the spirit of the law. Our results show that: (1) most firms enriched their re-permission messages with persuasive arguments aimed at increasing consumers' likelihood of relinquishing their data; (2) the use of persuasion is the outcome of a difficult tradeoff between costs and benefits; (3) most companies acted in their self-interest and “gamed the system”. Our results have important implications for policymakers, managers, and customers of the online sector.
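As a purely illustrative aside, the theme identification described here could be approximated in a first pass with simple keyword tagging over the email corpus. The theme names, keywords and function names below are invented for illustration, since the abstract does not describe the authors' actual coding procedure.

from collections import Counter

# Hypothetical themes and keywords, not the ones identified in the study
THEME_KEYWORDS = {
    "urgency": ["last chance", "act now", "before may 25"],
    "benefit": ["exclusive", "special offers", "stay updated"],
    "compliance": ["gdpr", "regulation", "consent", "privacy policy"],
}

def tag_themes(email_text):
    """Return the set of themes whose keywords appear in one email."""
    text = email_text.lower()
    return {theme for theme, words in THEME_KEYWORDS.items()
            if any(w in text for w in words)}

def theme_frequencies(emails):
    """Count how many emails in a corpus contain each theme."""
    counts = Counter()
    for email in emails:
        counts.update(tag_themes(email))
    return counts

In practice, dictionary-based tagging like this would only be a starting point; manual content analysis or topic modelling would be needed to identify dominant themes reliably.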
Abstract:
Schroeder's backward integration method is the most widely used method to extract the decay curve of an acoustic impulse response and to calculate the reverberation time from this curve. In the literature, the limits and the possible improvements of this method are widely discussed. In this work, a new method is proposed for the evaluation of the energy decay curve. The new method has been implemented in a Matlab toolbox, and its performance has been tested against the most accredited method in the literature. The values of EDT and reverberation time extracted from the energy decay curves calculated with both methods have been compared, both in terms of the values themselves and in terms of their statistical representativeness. The main case study consists of nine Italian historical theatres in which acoustical measurements were performed. The comparison of the two extraction methods has also been applied to a critical case, i.e. the structural impulse responses of some building elements. The comparison shows that both methods return comparable values of T30. As the evaluation range decreases, increasing differences emerge; in particular, the main differences are in the first part of the decay, where the EDT is evaluated. This is a consequence of the fact that the new method returns a “locally” defined energy decay curve, whereas Schroeder's method accumulates energy from the tail to the beginning of the impulse response. Another characteristic of the new method for energy decay curve extraction is its independence from background noise estimation. Finally, a statistical analysis is performed on the T30 and EDT values calculated from the impulse response measurements in the Italian historical theatres. The aim of this evaluation is to determine whether a subset of measurements can be considered representative for a complete characterization of these opera houses.
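As a purely illustrative aside, Schroeder's backward integration, the baseline against which the new method is compared, can be sketched in a few lines of Python. The function names, the -5 to -35 dB fitting range for T30 and the 0 to -10 dB range for EDT are standard conventions assumed here; this is not the Matlab toolbox described in the abstract.

import numpy as np

def schroeder_decay_db(ir):
    """Energy decay curve in dB via Schroeder backward integration:
    at each sample, the remaining energy of the impulse response from
    that instant to its end, normalized by the total energy."""
    energy = np.cumsum(ir[::-1] ** 2)[::-1]    # backward cumulative energy
    return 10.0 * np.log10(energy / energy[0])

def decay_time(edc_db, fs, db_start, db_end, extrapolate_to=60.0):
    """Fit a line to the EDC between db_start and db_end (e.g. -5 and -35 dB
    for T30, 0 and -10 dB for EDT) and extrapolate it to a 60 dB decay."""
    idx = np.where((edc_db <= db_start) & (edc_db >= db_end))[0]
    t = idx / fs                                # sample indices to seconds
    slope, _ = np.polyfit(t, edc_db[idx], 1)    # decay slope in dB/s
    return -extrapolate_to / slope

# Hypothetical usage on a measured impulse response `ir` sampled at `fs` Hz:
# edc = schroeder_decay_db(ir)
# t30 = decay_time(edc, fs, -5.0, -35.0)
# edt = decay_time(edc, fs, 0.0, -10.0)

Because the backward integral accumulates energy from the tail of the impulse response towards its beginning, the early part of the curve, where EDT is read, is the most sensitive to the tail and to background noise, which is the behaviour the “locally” defined decay curve proposed in the thesis is designed to avoid.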