16 results for "analysis to synthesis"

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

100.00%

Abstract:

This thesis presents a creative and practical approach to dealing with the problem of selection bias. Selection bias may be the most vexing problem in program evaluation, or in any line of research that attempts to assert causality. Some of the greatest minds in economics and statistics have scrutinized the problem, and the resulting approaches – Rubin's Potential Outcome Approach (Rosenbaum and Rubin, 1983; Rubin, 1991, 2001, 2004) and Heckman's Selection Model (Heckman, 1979) – are widely accepted and used as the best fixes. These solutions to the bias that arises, in particular, from self-selection are imperfect, and many researchers, when feasible, reserve their strongest causal inference for data from experimental rather than observational studies. The innovative aspect of this thesis is to propose a data transformation that allows the presence of selection bias to be measured and tested in an automatic and multivariate way. The approach involves the construction of a multi-dimensional conditional space of the X matrix in which the bias associated with the treatment assignment has been eliminated. Specifically, we propose the use of a partial dependence analysis of the X-space as a tool for investigating the dependence relationship between a set of observable pre-treatment categorical covariates X and a treatment indicator variable T, in order to obtain a measure of bias according to their dependence structure. The measure of selection bias is then expressed in terms of the inertia due to the dependence between X and T that has been eliminated. Given this measure of selection bias, we propose a multivariate test of imbalance to check whether the detected bias is significant, using the asymptotic distribution of the inertia due to T (Estadella et al., 2005) and preserving the multivariate nature of the data.
Further, we propose the use of a clustering procedure as a tool to find groups of comparable units on which to estimate local causal effects, and the use of the multivariate test of imbalance as a stopping rule for choosing the best cluster solution. The method is non-parametric: it does not call for modeling the data based on some underlying theory or assumption about the selection process, but instead uses the existing variability within the data and lets the data speak. The idea of proposing this multivariate approach to measuring selection bias and testing balance comes from the observation that, in applied research, all aspects of multivariate balance not represented in univariate variable-by-variable summaries are ignored. The first part contains an introduction to evaluation methods as part of public and private decision processes and a review of the evaluation methods literature. Attention is focused on Rubin's Potential Outcome Approach, matching methods, and, briefly, Heckman's Selection Model. The second part focuses on some resulting limitations of conventional methods, with particular attention to the problem of how to test balance correctly. The third part contains the original contribution: the proposed method, a simulation study that checks its performance for a given dependence setting, and an application to a real data set. Finally, we discuss the results, draw conclusions, and outline future perspectives.
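The inertia-based imbalance measure described above can be illustrated with a minimal sketch (the function name and this one-covariate simplification are mine, not the thesis's formulation): for a single categorical covariate, the inertia due to T reduces to the Pearson chi-square statistic of the X-by-T contingency table divided by the sample size, and its significance can be checked against the asymptotic chi-square distribution.

```python
import numpy as np
from scipy.stats import chi2

def inertia_imbalance(x, t):
    """Inertia (chi-square / n) between categorical x and treatment t, with p-value."""
    xs, x_idx = np.unique(x, return_inverse=True)
    ts, t_idx = np.unique(t, return_inverse=True)
    n = len(x)
    table = np.zeros((len(xs), len(ts)))
    np.add.at(table, (x_idx, t_idx), 1)          # X-by-T contingency table
    expected = table.sum(1, keepdims=True) * table.sum(0, keepdims=True) / n
    stat = ((table - expected) ** 2 / expected).sum()
    dof = (len(xs) - 1) * (len(ts) - 1)
    return stat / n, chi2.sf(stat, dof)          # inertia, asymptotic p-value
```

With perfectly confounded assignment the inertia reaches its maximum of 1 for a binary T; under independence it approaches 0 and the p-value is large.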

Relevance:

100.00%

Abstract:

The objectives of this thesis are to develop new methodologies for the study of the space and time variability of the Italian upper ocean ecosystem, through the combined use of multi-sensor satellite data and in situ observations, and to identify the capabilities and limits of remote sensing observations for monitoring the marine state at short and long time scales. Three oceanographic basins were selected and subjected to different types of analyses. The first region is the Tyrrhenian Sea, where a comparative analysis of altimetry and Lagrangian measurements was carried out to study the surface circulation. The results deepened the knowledge of the Tyrrhenian Sea surface dynamics and its variability, and defined the limitations of satellite altimetry measurements in detecting small-scale marine circulation features. The Channel of Sicily study aimed to identify the spatial-temporal variability of phytoplankton biomass and to understand the impact of the upper ocean circulation on the marine ecosystem. A combined analysis of long-term satellite time series of chlorophyll, Sea Surface Temperature, and Sea Level data was applied. The results identified the key role of the Atlantic water inflow in modulating the seasonal variability of phytoplankton biomass in the region. Finally, the Italian coastal marine system was studied with the objective of exploring the potential of Ocean Color data for detecting chlorophyll trends in coastal areas. The most appropriate methodology for detecting long-term environmental changes was defined through intercomparison of chlorophyll trends detected by in situ and satellite data. Italian coastal areas subject to eutrophication problems were then identified. This work has demonstrated that satellite data constitute a unique opportunity to define the features and forcings influencing upper ocean ecosystem dynamics, and can also be used to monitor environmental variables capable of influencing phytoplankton productivity.

Relevance:

100.00%

Abstract:

This thesis aims at investigating a new approach to document analysis based on the idea of structural patterns in XML vocabularies. My work is founded on the belief that authors do naturally converge to a reasonable use of markup languages and that extreme, yet valid instances are rare and limited. Actual documents, therefore, may be used to derive classes of elements (patterns) persisting across documents and distilling the conceptualization of the documents and their components, and may give ground for automatic tools and services that rely on no background information (such as schemas) at all. The central part of my work consists in introducing from the ground up a formal theory of eight structural patterns (with three sub-patterns) that are able to express the logical organization of any XML document, and verifying their identifiability in a number of different vocabularies. This model is characterized by and validated against three main dimensions: terseness (i.e. the ability to represent the structure of a document with a small number of objects and composition rules), coverage (i.e. the ability to capture any possible situation in any document) and expressiveness (i.e. the ability to make explicit the semantics of structures, relations and dependencies). An algorithm for the automatic recognition of structural patterns is then presented, together with an evaluation of the results of a test performed on a set of more than 1100 documents from eight very different vocabularies. This language-independent analysis confirms the ability of patterns to capture and summarize the guidelines used by the authors in their everyday practice. Finally, I present some systems that work directly on the pattern-based representation of documents. The ability of these tools to cover very different situations and contexts confirms the effectiveness of the model.
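As a toy illustration of this kind of structural analysis (the class names and criteria below are mine, far coarser than the eight patterns the thesis formalizes), XML elements can be bucketed by inspecting their content model with no schema at all:

```python
import xml.etree.ElementTree as ET

def content_class(elem):
    """Classify an element as empty, text-only, element-only, or mixed content."""
    has_text = bool((elem.text or "").strip()) or any(
        (child.tail or "").strip() for child in elem)
    has_children = len(elem) > 0
    if has_children and has_text:
        return "mixed"
    if has_children:
        return "element-only"
    return "text-only" if has_text else "empty"
```

Running such a classifier over a corpus, as opposed to reading a schema, is the schema-free spirit of the pattern-recognition algorithm described in the abstract.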

Relevance:

100.00%

Abstract:

Quantitative imaging in oncology aims at developing imaging biomarkers for the diagnosis and prediction of cancer aggressiveness and therapy response before any morphological change becomes visible. This thesis exploits Computed Tomography perfusion (CTp) and multiparametric Magnetic Resonance Imaging (mpMRI) to investigate diverse cancer features in different organs. I developed a voxel-based image analysis methodology in CTp and extended its use to mpMRI, to perform precise and accurate analyses at the single-voxel level. This is expected to improve the reproducibility of measurements, the comprehension of cancer mechanisms, and clinical interpretability. CTp has not yet entered clinical routine, despite its usefulness in monitoring cancer angiogenesis, because different perfusion computing methods yield unreproducible results. Machine learning applications in mpMRI, useful for detecting imaging features representative of cancer heterogeneity, are likewise mostly limited to clinical research, because the variability and difficult interpretability of their results leave clinicians without confidence in clinical applications. In hepatic CTp, I investigated whether, and under what conditions, two widely adopted perfusion methods, Maximum Slope (MS) and Deconvolution (DV), could yield reproducible parameters. To this end, I developed signal processing methods to model the first-pass kinetics and remove any numerical cause hampering reproducibility. In mpMRI, I proposed a new approach to extract local first-order features, aiming at preserving the spatial reference and making their interpretation easier. In CTp, I found the cause of the MS and DV non-reproducibility: MS and DV represent two different states of the system. Transport delays invalidate the MS assumptions; by correcting the MS formulation, I obtained voxel-based equivalence of the two methods.
In mpMRI, the developed predictive models allowed (i) detecting rectal cancers that respond to neoadjuvant chemoradiation, which show, at pre-therapy, sparse coarse subregions with altered density, and (ii) predicting clinically significant prostate cancers from the disproportion between high- and low-diffusivity gland components.
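For context on the Maximum Slope method discussed above, here is a deliberately simplified, hypothetical sketch (it omits the transport-delay correction that is the thesis's actual contribution): per-voxel blood flow is classically estimated as the peak slope of the tissue time-density curve divided by the peak arterial enhancement, under the assumption of no venous outflow during the first pass.

```python
import numpy as np

def max_slope_flow(tissue_curve, arterial_curve, dt):
    """Classical Maximum Slope estimate of per-voxel blood flow.

    tissue_curve, arterial_curve: enhancement samples over time; dt: sampling step [s].
    """
    slope = np.max(np.gradient(np.asarray(tissue_curve, dtype=float), dt))
    return slope / np.max(arterial_curve)  # peak tissue slope over peak arterial enhancement
```

Any delay between the arterial and tissue curves biases the measured peak slope, which is precisely why, as the abstract notes, uncorrected MS and DV disagree.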

Relevance:

100.00%

Abstract:

The high quality of protected designation of origin (PDO) dry-cured pork products depends largely on the chemical and physical parameters of the fresh meat and their variation during the production process. The discovery of the mechanisms that regulate the variability of these parameters has been aided by the swine reference genome in conjunction with genetic analysis methods. This thesis contributes to the discovery of genetic mechanisms that regulate the variability of some quality parameters of fresh meat for PDO dry-cured pork production. The first study, on gene expression, showed that between low and high glycolytic potential (GP) samples of Semimembranosus muscle of Italian Large White (ILW) pigs in early postmortem, all but one of the differentially expressed genes were over-expressed in low-GP samples. These genes were involved in ATP biosynthesis, calcium homeostasis, and lipid metabolism, and included the potential master regulator gene Peroxisome Proliferator-Activated Receptor Alpha (PPARA). The second study, in commercial hybrid pigs, evaluated correlations between carcass and fresh ham traits, including carcass and fresh ham lean meat percentages, the former being a potential predictor of the latter. In addition, a genome-wide association study identified chromosome-wide associations with phenotypic traits for 19 SNPs, and genome-wide associations with ferrochelatase activity for 14 SNPs. The latter could be a determinant of color variation in nitrite-free dry-cured ham. The third study showed gene expression differences in the Longissimus thoracis muscle of ILW pigs fed diets with extruded linseed (a source of polyunsaturated fatty acids) and either vitamin E and selenium (diet three) or natural antioxidants (diet four).
Diet three promoted a more rapid and massive immune system response, possibly determined by an improvement in muscle tissue function, while diet four promoted oxidative stability and increased the anti-inflammatory potential of muscle tissue.

Relevance:

90.00%

Abstract:

This work is structured as follows. In Section 1 we discuss the clinical problem of heart failure. In particular, we present the phenomenon known as ventricular mechanical dyssynchrony: its impact on cardiac function, the therapy for its treatment, and the methods for its quantification. Specifically, we describe the conductance catheter and its use for the measurement of dyssynchrony. At the end of Section 1, we propose a new set of indexes to quantify dyssynchrony, which are studied and validated thereafter. In Section 2 we describe the studies carried out in this work: we report the experimental protocols, and we present and discuss the results obtained. Finally, we report the overall conclusions drawn from this work and envisage future work and possible clinical applications of our results. Ancillary studies carried out during this work, mainly to investigate several aspects of cardiac resynchronization therapy (CRT), are mentioned in the Appendix. -------- Ventricular mechanical dyssynchrony plays a regulating role already in normal physiology, but is especially important in pathological conditions such as hypertrophy, ischemia, infarction, or heart failure (Chapters 1, 2). Several prospective randomized controlled trials have supported the clinical efficacy and safety of cardiac resynchronization therapy (CRT) in patients with moderate or severe heart failure and ventricular dyssynchrony. CRT resynchronizes ventricular contraction by simultaneous pacing of both the left and right ventricle (biventricular pacing) (Chapter 1). The conductance catheter method has been used extensively to assess global systolic and diastolic ventricular function and, more recently, the ability of this instrument to pick up multiple segmental volume signals has been used to quantify mechanical ventricular dyssynchrony.
Specifically, novel indexes based on volume signals acquired with the conductance catheter were introduced to quantify dyssynchrony (Chapters 3, 4). The present work aimed to describe the characteristics of the conductance-volume signals, to investigate the performance of the indexes of ventricular dyssynchrony described in the literature, and to introduce and validate improved dyssynchrony indexes. Moreover, using the conductance catheter method and the new indexes, the clinical problem of ventricular pacing site optimization was addressed, and the measurement protocol to adopt for hemodynamic tests on cardiac pacing was investigated. In accordance with the aims of the work, in addition to the classical time-domain parameters, a new set of indexes was extracted, based on a coherent averaging procedure and on spectral and cross-spectral analysis (Chapter 4). Our analyses were carried out on patients with indications for electrophysiologic study or device implantation (Chapter 5). For the first time, besides patients with heart failure, indexes of mechanical dyssynchrony based on the conductance catheter were extracted and studied in a population of patients with preserved ventricular function, providing information on the normal range of such values. By performing a frequency-domain analysis and applying an optimized coherent averaging procedure (Chapter 6.a), we were able to describe some characteristics of the conductance-volume signals (Chapter 6.b). We unmasked the presence of considerable beat-to-beat variations in dyssynchrony, which seemed more frequent in patients with ventricular dysfunction and appeared to play a role in discriminating patients. These non-recurrent mechanical ventricular non-uniformities are probably the expression of the substantial beat-to-beat hemodynamic variations often associated with heart failure and due to cardiopulmonary interaction and conduction disturbances.
We investigated how the coherent averaging procedure may affect or refine the conductance-based indexes; in addition, we proposed and tested a new set of indexes that quantify the non-periodic components of the volume signals. Using the new set of indexes, we studied the acute effects of CRT and of right ventricular pacing in patients with heart failure and in patients with preserved ventricular function. In the overall population we observed a correlation between the hemodynamic changes induced by pacing and the indexes of dyssynchrony, and this may have practical implications for hemodynamic-guided device implantation. The optimal ventricular pacing site for patients with conventional indications for pacing remains controversial, and the majority of these patients do not meet current clinical indications for CRT pacing. Thus, we carried out an analysis to compare the impact of several ventricular pacing sites on global and regional ventricular function and dyssynchrony (Chapter 6.c). We observed that right ventricular pacing worsens cardiac function in patients with and without ventricular dysfunction unless the pacing site is optimized. CRT preserves left ventricular function in patients with normal ejection fraction and improves function in patients with poor ejection fraction, despite no clinical indication for CRT. Moreover, the analysis of the results obtained using the new indexes of regional dyssynchrony suggests that the pacing site may influence overall global ventricular function depending on its relative effects on regional function and synchrony. Another clinical problem investigated in this work is the optimal right ventricular lead location for CRT (Chapter 6.d).
As in the previous analysis, using novel parameters describing local synchrony and efficiency, we tested the hypothesis, and demonstrated, that biventricular pacing with alternative right ventricular pacing sites produces acute improvement of ventricular systolic function and improves mechanical synchrony when compared to standard right ventricular pacing. Although no specific right ventricular location was shown to be superior during CRT, the right ventricular pacing site that produced the optimal acute hemodynamic response varied between patients. Acute hemodynamic effects of cardiac pacing are conventionally evaluated after stabilization periods, whose applied duration varies considerably across cardiac pacing studies. With an ad hoc protocol (Chapter 6.e) and indexes of mechanical dyssynchrony derived from the conductance catheter, we demonstrated that the use of stabilization periods during the evaluation of cardiac pacing may mask early changes in systolic and diastolic intra-ventricular dyssynchrony. In fact, at the onset of ventricular pacing, the main dyssynchrony and ventricular performance changes occur within a 10 s time span, initiated by the changes in ventricular mechanical dyssynchrony induced by aberrant conduction and followed by a partial or even complete recovery. It had already been demonstrated in normal animals that ventricular mechanical dyssynchrony may act as a physiologic modulator of cardiac performance, together with heart rate, contractile state, preload, and afterload. The present observation, which shows the compensatory mechanism of mechanical dyssynchrony, suggests that ventricular dyssynchrony may be regarded as an intrinsic cardiac property, with baseline dyssynchrony at an increased level in heart failure patients.
To make available an independent system for cardiac output estimation, and so confirm the results obtained with the conductance volume method, we developed and validated a novel technique to apply the Modelflow method (a method that derives an aortic flow waveform from arterial pressure by simulation of a non-linear three-element aortic input impedance model; Wesseling et al., 1993) to the left ventricular pressure signal, instead of the arterial pressure used in the classical approach (Chapter 7). The results confirmed that, in patients without valve abnormalities undergoing conductance catheter evaluations, the continuous monitoring of cardiac output using the intra-ventricular pressure signal is reliable. Thus, cardiac output can be monitored quantitatively and continuously with a simple and low-cost method. During this work, additional studies were carried out to investigate several areas of uncertainty in CRT. The results of these studies are briefly presented in the Appendix: the long-term survival of patients treated with CRT in clinical practice, the effects of CRT in patients with mild symptoms of heart failure and in very old patients, limited thoracotomy as a second-choice alternative to transvenous implant for CRT delivery, the evolution and prognostic significance of the diastolic filling pattern in CRT, the selection of candidates for CRT with echocardiographic criteria, and the prediction of response to the therapy.
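The coherent averaging idea behind the conductance-volume indexes can be sketched minimally (an illustration with invented names, not the optimized procedure of Chapter 6.a): beats are aligned on a trigger such as the ECG R-wave, windowed to a common length, and ensemble-averaged, so that recurrent components reinforce while non-periodic beat-to-beat variations cancel out.

```python
import numpy as np

def coherent_average(signal, trigger_idx, beat_len):
    """Ensemble-average fixed-length windows of `signal` starting at each trigger index."""
    beats = [signal[i:i + beat_len] for i in trigger_idx
             if i + beat_len <= len(signal)]   # drop incomplete final beats
    return np.mean(beats, axis=0)
```

The residual after subtracting the coherent average from each beat is one natural way to isolate the non-periodic components that the abstract's new indexes quantify.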

Relevance:

90.00%

Abstract:

The habenular nuclei are diencephalic structures present in vertebrates; together with the associated fiber systems, they form part of the system that connects the telencephalon to the ventral mesencephalon (Concha M. L. and Wilson S. W., 2001). In representative species of almost all classes of vertebrates the habenular nuclei are asymmetric, both in terms of size and of neuronal and neurochemical organization, although different types of asymmetry follow different evolutionary courses. Previous studies have analyzed the spread and diversity of the asymmetry in species for which the data are not clear (Kemali M. et al., 1980). Notwithstanding that, the evolution of the phenomenon is still not fully understood, and the ontogenetic mechanisms that lead to the development of habenular asymmetry are not clear (Smeets W. J. et al., 1983). For the present study, 14 species of elasmobranchs and 15 species of teleosts were used. Brains removed from the animals were fixed in 4% paraformaldehyde in phosphate buffer and examined with different techniques; I used histological, immunohistochemical, and ultrastructural analyses to describe this asymmetry. My results confirm data previously obtained in other elasmobranch species, in which the left habenula is larger than the right one; the teleosts show slight differences in the size of the habenular ganglia, with the left habenular nucleus larger than the right in some species. In the course of these studies, a correlation between life habits and diencephalic asymmetry seems to emerge: among the teleosts analyzed, the species with a benthic life (like Lepidorhombus boscii, Platichthys flesus, Solea vulgaris) seem to possess a slight asymmetry, analogous to that of the elasmobranchs, while in the other species (like Liza aurata, Anguilla anguilla, Trisopterus minutus) the habenulae are symmetrical.
However, various aspects of the neuroanatomical asymmetries of the epithalamus have not yet been examined in enough depth to obtain a complete picture of the evolution of this phenomenon, and new studies are needed on the species without clear asymmetry, in order to understand the spread and diversity of habenular asymmetry among vertebrates.

Relevance:

90.00%

Abstract:

Every seismic event produces seismic waves which travel throughout the Earth. Seismology is the science of interpreting measurements of these waves to derive information about the structure of the Earth. Seismic tomography is the most powerful tool for determining the 3D structure of the Earth's deep interior. Tomographic models obtained at global and regional scales are an underlying tool for determining the geodynamical state of the Earth, showing evident correlation with other geophysical and geological characteristics. Global tomographic images of the Earth can be written as linear combinations of basis functions from a specifically chosen set, defining the model parameterization. A number of different parameterizations are commonly seen in the literature: seismic velocities in the Earth have been expressed, for example, as combinations of spherical harmonics or by means of the simpler characteristic functions of discrete cells. With this work we focus on this aspect, evaluating a new type of parameterization based on wavelet functions. It is known from classical Fourier theory that a signal can be expressed as the sum of a, possibly infinite, series of sines and cosines, often referred to as a Fourier expansion. The big disadvantage of a Fourier expansion is that it has only frequency resolution and no time resolution. Wavelet analysis (or the wavelet transform) is probably the most recent solution to overcome the shortcomings of Fourier analysis. The fundamental idea behind this analysis is to study a signal according to scale. Wavelets, in fact, are mathematical functions that cut up data into different frequency components, so that each component can be studied with a resolution matched to its scale; they are especially useful in the analysis of non-stationary processes that contain multi-scale features, discontinuities, and sharp spikes.
Wavelets are used essentially in two ways in studies of geophysical processes or signals: 1) as a basis for the representation or characterization of a process; 2) as an integration kernel for analysis, to extract information about the process. These two types of application of wavelets to the geophysical field are the object of study of this work. First, we use wavelets as a basis to represent and solve the tomographic inverse problem. After a brief introduction to seismic tomography theory, we assess the power of wavelet analysis in the representation of two different types of synthetic models; we then apply it to real data, obtaining surface wave phase velocity maps and evaluating its abilities by comparison with another type of parameterization (i.e., block parameterization). For the second type of wavelet application, we analyze the ability of the Continuous Wavelet Transform in spectral analysis, starting again with some synthetic tests to evaluate its sensitivity and capability, and then applying the same analysis to real data to obtain local correlation maps between different models at the same depth or between different profiles of the same model.
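The multiresolution idea described above can be made concrete with a one-level Haar transform, the simplest wavelet (a toy illustration, not the parameterization adopted in this work): the signal is split into a coarse approximation (local averages) and a detail component (local differences), and the step can be inverted exactly.

```python
import numpy as np

def haar_step(signal):
    """One level of the Haar wavelet transform (signal length must be even)."""
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2)  # low-pass: scaled local averages
    detail = (s[0::2] - s[1::2]) / np.sqrt(2)  # high-pass: scaled local differences
    return approx, detail

def haar_inverse(approx, detail):
    """Exactly invert one Haar step."""
    s = np.empty(2 * len(approx))
    s[0::2] = (approx + detail) / np.sqrt(2)
    s[1::2] = (approx - detail) / np.sqrt(2)
    return s
```

Recursing `haar_step` on the approximation yields the full multi-scale decomposition: smooth regional structure ends up in coarse coefficients, while sharp local features are captured by a few detail coefficients, which is what makes wavelets attractive as a tomographic basis.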

Relevance:

90.00%

Abstract:

Among various nanoparticles, noble metal nanoparticles have attracted considerable attention due to their optical, catalytic, and conducting properties. This work focused on the development of an innovative method of synthesis for the preparation of metal nanosuspensions of Au, Ag, and Cu, in order to achieve stable sols with features suitable for industrial scale-up of the processes. The research was developed in collaboration with a company interested in the large-scale production of the studied nanosuspensions. In order to develop a commercial process, high solid concentration, long-term colloidal stability, and particle size control are required. Two synthesis routes, differing in the solvent used, were implemented: polyol-based and water-based synthesis. To achieve process intensification, microwave heating was applied. As a result, colloidal nanosuspensions with suitable dimensions, good optical properties, very high solid content, and good stability were synthesized by simple and environmentally friendly methods. In particular, following some interesting results, an optimized synthesis process has been patented. Both the water- and polyol-based syntheses, developed in the presence of a reducing agent and a chelating polymer, allowed particle size control and colloidal stability to be obtained by tuning the different parameters. Furthermore, it was verified that the microwave device, thanks to its rapid and homogeneous heating, provides some advantages over the conventional method. To optimize the final suspension properties, the effect of different parameters (temperature, time, precursor concentrations, etc.) was studied for each synthesis, and through a specific optimization effort proper control of the nucleation and growth processes was achieved. XRD analysis confirmed that the obtained nanoparticles were the desired metal phases, even at the lowest synthesis temperatures.
The particles showed diameters, measured by STEM and dynamic light scattering (DLS), ranging from 10 to 60 nm. Surface plasmon resonance (SPR) was monitored by UV-VIS spectroscopy, confirming its dependence on nanoparticle size and shape. Moreover, the reaction yield was assessed by ICP analysis of the unreacted metal cations. Finally, thermal conductivity characterization of the copper sols and antibacterial activity characterization of the silver sols are now ongoing, in order to assess their application as nanofluids in heat transfer processes and as antibacterial agents, respectively.

Relevance:

90.00%

Abstract:

Animal models have been essential for studying the molecular mechanisms of cancer and for developing new antitumor agents. However, the large evolutionary divergence between mouse and human has made it difficult to translate the achievements gained in preclinical mouse-based studies. The generation of clinically relevant murine models requires their humanization, both through the creation of transgenic models and through the generation of humanized mice in which to engraft a functional human immune system and reproduce the physiological effects and molecular mechanisms of growth and metastasization of human tumors. In particular, the availability of genotypically stable immunodepressed mice able to accept tumor injection and allow human tumor growth and metastasization is important for developing anti-tumor and anti-metastatic strategies. Recently, Rag2-/-;gammac-/- mice, double knockouts for genes involved in lymphocyte differentiation, were developed (CIEA, Central Institute for Experimental Animals, Kawasaki, Japan). Studies of human sarcoma metastasization in Rag2-/-;gammac-/- mice (which lack B, T, and NK functionality) revealed their high metastatic efficiency and allowed the expression of human metastatic phenotypes not detectable in the conventionally used nude mouse model. In vitro analyses investigating the molecular mechanisms involved in the specific pattern of human sarcoma metastasization revealed the importance of liver-produced growth and motility factors, in particular the insulin-like growth factors (IGFs). The involvement of these growth factors was then demonstrated in vivo through inhibition of the IGF signalling pathway. Given the high growth and metastatic propensity of the tumor cells, Rag2-/-;gammac-/- mice were used as a model to investigate the metastatic behavior of rhabdomyosarcoma cells engineered to improve their differentiation.
It has recently been shown that this immunodeficient model can be reconstituted with a human immune system through the injection of human cord blood progenitor cells. The work illustrated in this thesis revealed that different injected human progenitor cells (CD34+ or CD133+) show distinct engraftment and differentiation abilities. Cell vaccination experiments were performed to investigate the functionality of the engrafted human immune system and the induction of specific human immune responses. Results from such experiments will provide information about the human immune responses activated during cell vaccination and help define the best reconstitution and experimental conditions to create a humanized model in which to study, in a preclinical setting, immunological antitumor strategies.

Relevance:

90.00%

Abstract:

The aim of this thesis was to describe the development of motion analysis protocols for applications on the upper and lower limbs, using inertial sensor-based systems. Inertial sensor-based systems are relatively recent; knowledge and development of methods and algorithms for their clinical use are therefore limited compared with stereophotogrammetry. However, their advantages in terms of low cost, portability, and small size are a valid reason to follow this direction. When developing motion analysis protocols based on inertial sensors, attention must be given to several aspects, such as the accuracy and reliability of inertial sensor-based systems. The need to develop specific algorithms, methods, and software for using these systems in specific applications is as important as the development of the motion analysis protocols based on them. For this reason, the goal of the 3-year research project described in this thesis was pursued first of all by carefully designing the protocols based on inertial sensors, exploring and developing the features suitable for each specific application. The use of optoelectronic systems was necessary because they provided a gold-standard, accurate measurement, used as a reference for the validation of the protocols based on inertial sensors. The protocols described in this thesis can be particularly helpful for rehabilitation centers in which the high cost of instrumentation or the limited working areas do not allow the use of stereophotogrammetry. Moreover, many applications requiring upper and lower limb motion analysis outside the laboratory will benefit from these protocols, for example gait analysis performed along corridors. Outside the building, steady-state walking or the behavior of prosthetic devices when encountering slopes or obstacles during walking can also be assessed.
The application of inertial sensors to lower limb amputees presents conditions that are challenging for magnetometer-based systems, due to the ferromagnetic materials commonly adopted in the construction of hydraulic components and motors. The INAIL Prostheses Centre stimulated and, together with Xsens Technologies B.V., supported the development of additional methods for improving the accuracy of the MTx in measuring the 3D kinematics of lower limb prostheses, with the results presented in this thesis. In the author's opinion, this thesis and the inertial sensor-based motion analysis protocols described here demonstrate how close collaboration between industry, clinical centers, and research laboratories can improve knowledge and exchange know-how, with the common goal of developing new application-oriented systems.
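The sensor-fusion step at the heart of such inertial protocols can be illustrated with a minimal complementary filter, which blends short-term gyroscope integration with long-term accelerometer gravity sensing. This is a generic sketch, not the thesis's actual algorithm; the function name, sample rate, and blending weight are all hypothetical:

```python
import math

def complementary_filter(gyro_rates, accels, dt, alpha=0.98):
    """Estimate pitch (rad) by fusing gyroscope pitch rate (rad/s)
    with the pitch implied by the accelerometer's gravity vector."""
    pitch = 0.0
    estimates = []
    for omega, (ax, ay, az) in zip(gyro_rates, accels):
        # Gyroscope integration: accurate short-term, drifts long-term
        gyro_pitch = pitch + omega * dt
        # Gravity-based pitch: noisy short-term, stable long-term
        accel_pitch = math.atan2(-ax, math.sqrt(ay**2 + az**2))
        # Blend: high-pass the gyro term, low-pass the accel term
        pitch = alpha * gyro_pitch + (1 - alpha) * accel_pitch
        estimates.append(pitch)
    return estimates

# A static, level sensor: zero rotation rate, gravity along +z
est = complementary_filter([0.0] * 100, [(0.0, 0.0, 9.81)] * 100, dt=0.01)
```

In a real protocol the fused orientation of sensors mounted on adjacent segments would then yield joint angles, which is where validation against an optoelectronic gold standard becomes essential.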

Relevância:

90.00%

Publicador:

Resumo:

The city is a collection of built structures and infrastructure embedded in socio-cultural processes: any investigation into a city's transformations involves considering the degree to which its composite elements respond to socio-economic changes. The main purpose of this research is to investigate how transformations in the functional requirements of New York's society have spurred, since the 1970s, changes in both the city's urban structure and physical form. The present work examines the rise of Amenity Zones in New York and investigates the transformations that have occurred in New York's built environment since the 1970s. By applying qualitative measures and analyzing the relationship between urban amenities and the creative class, it has investigated changes in the urban structure and detected a hierarchical series of amenity zone classes, namely Super Amenity Zones (SAZs), Nodal Amenity Zones (NAZs), and Peripheral Amenity Zones (PAZs). This series allows for a more comprehensive reading of the urban structure in a complex city like New York, advancing the amenity zone methodology. In order to examine how the other component of the city, its physical form, has changed or adapted to the new socio-economic conditions, the research applied Conzenian analysis to a selected study area, Atlantic Avenue. The results of this analysis reveal that, contrary to the urban structure, which changes rapidly, the physical form of New York is hard to modify completely, due to the resilience of the town plan and its elements and to preservation laws; the city instead adapts to socio-economic changes through processes of adaptive reuse or conversion.
In conclusion, this research has examined the dialectic between the ever-changing needs of society and the complexity of the built environment and urban structure, showing the different degrees to which the urban landscape modifies, reacts, and sometimes adapts to the population's functional requirements.

Relevância:

90.00%

Publicador:

Resumo:

The present PhD thesis focused on the development and application of a chemical methodology (Py-GC-MS) and of a data-processing approach based on multivariate data analysis (chemometrics). The chromatographic and mass spectrometric data obtained with this technique are particularly suitable for interpretation by chemometric methods such as PCA (Principal Component Analysis) for data exploration and SIMCA (Soft Independent Modelling of Class Analogy) for classification. As a first approach, some issues in the field of cultural heritage were addressed, with particular attention to the differentiation of binders used in painting. A marker of egg tempera, esterified phosphoric acid, a pyrolysis product of lecithin, was determined using HMDS (hexamethyldisilazane) rather than TMAH (tetramethylammonium hydroxide) as the derivatizing reagent. The validity of analytical pyrolysis as a tool to characterize and classify different types of bacteria was then verified. The FAME chromatographic profiles represent an important tool for bacterial identification. Because of the complexity of the chromatograms, it was possible to characterize the bacteria only at the genus level, while differentiation at the species level was achieved by means of chemometric analysis. To perform this study, the normalized peak areas of the fatty acids were taken into account, and chemometric methods were applied to the experimental datasets. The results demonstrate the effectiveness of analytical pyrolysis and chemometric analysis for the rapid characterization of bacterial species. Application to samples of bacterial (Pseudomonas mendocina), fungal (Pleurotus ostreatus), and mixed biofilms was also performed. A comparison of the chromatographic profiles established the possibility to:
• differentiate the bacterial and fungal biofilms according to their FAME profiles;
• characterize the fungal biofilm by means of the typical pattern of pyrolytic fragments derived from the saccharides present in the cell wall;
• identify the markers of bacterial and fungal biofilms in the same mixed-biofilm sample.
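The exploratory step of such a chemometric workflow, PCA on a matrix of normalized peak areas, can be sketched as follows. The data here are synthetic stand-ins for FAME peak areas of two hypothetical bacterial species (the thesis's actual datasets and peak assignments are not reproduced), and PCA is computed directly via SVD:

```python
import numpy as np

# Hypothetical matrix of normalized FAME peak areas:
# rows = bacterial samples, columns = fatty-acid peaks.
rng = np.random.default_rng(0)
species_a = rng.normal([5.0, 1.0, 2.0, 0.5], 0.1, size=(10, 4))
species_b = rng.normal([1.0, 4.0, 0.5, 2.0], 0.1, size=(10, 4))
X = np.vstack([species_a, species_b])

# PCA via SVD of the mean-centered data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T               # sample coordinates on the PCs
explained = s**2 / np.sum(s**2)  # variance ratio per component

# With well-separated composition profiles, the two species fall on
# opposite sides of the first principal component.
```

A supervised method such as SIMCA would then build a separate PCA model per class and classify new samples by their distance to each model.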

Relevância:

90.00%

Publicador:

Resumo:

Background: Glioblastoma multiforme (GBM) is one of the most prevalent and aggressive malignant primary brain tumors in adult patients. 64CuCl2 is an innovative radiopharmaceutical investigated as a theranostic agent in GBM patients. Since the therapeutic scheme is still under evaluation, the research focused on the possibility of radioresistance development. miRNAs could be the actors responsible for modulating radioresistance, so their potential use was investigated both in radioresistant cell lines and in plasma samples from GBM patients. Methods: Radioresistant cell lines were generated by exposing the U87MG and U373MG lines to increasing doses of radiation for 32 weeks. Cell membrane permeability alterations and DNA damage were assessed to characterize the lines. Moreover, 64Cu cell incorporation and subcellular distribution were investigated by measuring gamma-radiation emission. miRNA expression was evaluated in the parental and radioresistant cell lines, both in cell pellets and in media exosomes, and in plasma samples of GBM patients using TaqMan Array MicroRNA Cards. Results: The radioresistant lines exhibited reduced membrane permeability and fewer DNA DSBs, indicating the capability to escape the drug's killing effect. Cell uptake assays showed internalization of 64Cu in both the sensitive and radioresistant lines. The radioresistant lines showed a miRNA expression profile different from that of the parental lines. Five miRNAs were selected as possible biomarkers of response to treatment (miR-339-3p, miR-133b, miR-103a-3p, miR-32-5p, miR-335-5p) and six miRNAs as possible predictive biomarkers of response to treatment (let-7e-5p, miR-15a-5p, miR-29c-3p, miR-495, miR-146b-5p, miR-199a-5p). miR-32-5p was selected as a possible molecule to restore 64CuCl2 responsiveness in the radioresistant cell lines. Conclusions: This is the first study describing the development and characterization of 64CuCl2-radioresistant cell lines, useful for refining the approach to dosimetric analysis so as to avoid the onset of radioresistance.
miRNAs could lead to a better understanding of 64CuCl2 treatment, becoming a useful tool both for detecting treatment response and as molecules that could restore responsiveness to 64CuCl2 treatment.

Relevância:

90.00%

Publicador:

Resumo:

The dynamics and geometry of the material inflowing and outflowing close to the supermassive black hole in active galactic nuclei are still uncertain. X-rays are the most suitable way to study the innermost regions of AGN, thanks to the Fe Kα emission line, a proxy of accretion, and to Fe absorption lines produced by outflows. Winds are typically classified as Warm Absorbers (slow and mildly ionized) and Ultra Fast Outflows (fast and highly ionized). Transient obscurers, optically thick winds that produce a strong spectral hardening in X-rays lasting from days to months, have been observed recently. Emission and absorption features vary on time-scales from hours to years, probing phenomena at different distances from the SMBH. In this work, we use time-resolved spectral analysis to investigate the accretion and ejection flows, to characterize them individually, and to search for correlations. We analyzed XMM-Newton data of a set of the brightest Seyfert 1 galaxies that went through an obscuration event: NGC 3783, NGC 3227, NGC 5548, and NGC 985. Our aim is to search for emission/absorption lines in short-duration spectra (∼10 ks), to explore regions as close to the SMBH as the statistics allow, and possibly to catch transient phenomena. First we run a blind search to detect emission/absorption features; then we analyze their evolution with Residual Maps: we simultaneously visualize positive and negative residuals from the continuum in the time-energy plane, looking for patterns and their time-scales. In NGC 3783 we were able to ascribe variations of the Fe Kα emission line to absorption at the same energy by clumps in the obscurer, whose presence is detected at >3σ, and to determine the size of the clumps. In NGC 3227 we detected a wind at ∼0.2c at ∼2σ significance, briefly appearing during an obscuration event.
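The idea behind a Residual Map can be sketched with synthetic data: counts binned in time and energy are compared against a continuum model, and the residuals, in units of the Poisson error, are inspected for stripes at fixed energy. Everything here (the power-law continuum, bin sizes, line energy and depth) is an illustrative assumption, not the analysis actually performed on the XMM-Newton data:

```python
import numpy as np

# Synthetic X-ray counts binned in time (rows, ~10 ks slices) and
# energy (columns), with a simple power-law continuum model.
rng = np.random.default_rng(1)
n_time, n_energy = 8, 64
energies = np.linspace(4.0, 10.0, n_energy)   # keV
continuum = 200.0 * energies**-1.5            # expected counts per bin
model = np.tile(continuum, (n_time, 1))

# Simulated data: Poisson continuum plus a transient absorption trough
# near 6.7 keV appearing only in the last three time slices.
data = rng.poisson(model).astype(float)
line = np.exp(-0.5 * ((energies - 6.7) / 0.15) ** 2)
data[5:] -= 0.6 * model[5:] * line

# Residual map in sigma units (Poisson error ~ sqrt(model))
residuals = (data - model) / np.sqrt(model)

# A persistent negative stripe at fixed energy in the later slices
# flags the transient absorption feature.
stripe = residuals[5:, np.abs(energies - 6.7) < 0.2].mean()
```

In the real analysis, significance of such patterns would be assessed against the blind-search statistics rather than read off by eye.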