12 results for Relevance ranking

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna



Abstract:

The thesis deals with W. Spohn's theory of ranking functions, an epistemological doctrine whose aim is to give a rigorous logical form to the concepts of causation, law of nature and scientific explanation, starting from the notion of belief. The theory still lacks an organic, unified exposition and, above all, one formulated in immediately accessible language. My work, which presents itself as an introduction to it, also compares it with the theories that most influenced it or against which it positions itself as a rival.

The FIRST CHAPTER focuses on the theory of P. Gärdenfors, Spohn's most direct predecessor and source of inspiration; this allows the reader to become familiar with the basic notions of epistemic logic. In the Swedish philosopher's theory, knowledge is conceived as a process of acquiring beliefs, identified with propositions, into a set and expelling them from it. The three main epistemic phenomena are expansion, revision and contraction. In the first case a previously unknown proposition is stored; in the second, a proposition is expelled because its contradictory has been acquired; in the third, a proposition is deleted for the sake of hypothesis and the consequences of that deletion are investigated. The linguistic counterpart of this last phenomenon is the formulation of a counterfactual conditional. The epistemologist, as Gärdenfors conceives the task, is essentially a logician who must specify functions: that is, the rules that every passage from one epistemic set to the next, by expansion, revision or contraction, must respect.

The SECOND CHAPTER then deals with Spohn's theory, trying to expound it exhaustively but also very simply. In Spohn, too, the fundamental concept is clearly that of a function: in this case, however, it is the rule of subjective judgement that associates with each belief, identified with a proposition, a degree (a rank), expressed by a positive natural number or by zero. A rank is a degree of disbelief. Why disbelief (which carries a considerable conceptual burden)? Because the laws of belief so conceived display what Spohn calls a "pervasive analogy" with the laws of probability (he even speaks of a "pre-established harmony", a field he is still working on). The concept of conditionalization (analogous to that of conditional probability) is essential: a rank is assigned to a belief on the basis of (at least) one other belief. Thanks to this concept Spohn can formalize a phenomenon that escapes Gärdenfors, namely the presence of interdoxastic correlations among a subject's beliefs. In the predecessor's epistemic logic, in fact, everything reduces to whether or not a proposition is included in the set: neither degrees of belief nor the idea that one belief is believed on the basis of another are considered.

The THIRD CHAPTER moves on to Spohn's theory of causation. This notion too is approached from an epistemic perspective. According to Spohn it makes no sense to ask what the "real" features of causation "in the world" are; what must be studied instead is what happens when one believes that a causal correlation holds between two facts or events. This belief, too, is given a rigorous (and diversified: Spohn recognizes several types of cause) logical formalization.
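For readers unfamiliar with the formalism, the core clauses of Spohn's negative ranking functions, which the chapter summaries above and below presuppose, can be sketched as follows (this is the standard formulation found in the literature on ranking theory, not a quotation from the thesis; the notation is my own):

```latex
% A (negative) ranking function \kappa assigns to every proposition a degree of disbelief.
\kappa : \mathcal{A} \to \mathbb{N} \cup \{\infty\}, \qquad
\min\{\kappa(A),\, \kappa(\neg A)\} = 0, \qquad
\kappa(A \cup B) = \min\{\kappa(A),\, \kappa(B)\}

% Conditional ranks, the analogue of conditional probability (defined when \kappa(A) < \infty).
\kappa(B \mid A) = \kappa(A \cap B) - \kappa(A)

% A proposition is believed exactly when its negation is disbelieved to some positive degree.
A \text{ is believed} \iff \kappa(\neg A) > 0

% "Raising the epistemic status": A is a reason for B when conditioning on A raises B's two-sided rank.
\tau(B) = \kappa(\neg B) - \kappa(B), \qquad
A \text{ is a reason for } B \iff \tau(B \mid A) > \tau(B \mid \neg A)
```

On this picture, causes are (roughly) reasons that precede their effects, which is the "raising of the epistemic status" of the effect discussed next.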
A cause "raises the epistemic status" of its effect: that is, the effect is believed with a higher rank (i.e. a lower one, if one focuses on disbelief) when conditionalized on the cause. In the same chapter I present Gärdenfors' theory of causation, which is, however, less articulated and undermined by some errors.

The FOURTH CHAPTER is entirely devoted to David Lewis and his counterfactual theory of causation, the main rival of both Spohn and Gärdenfors. According to Lewis the best definition of cause can be given in counterfactual terms: a cause is an event such that, had it not occurred, the effect would not have occurred either. Naturally, this obliges him to specify a theory of the truth conditions for this class of statements, which, running against the facts by definition, cannot be compared with reality. Lewis therefore resorts to the doctrine of possible worlds and of their comparative similarity, concluding that a counterfactual is true if the possible world in which its antecedent and consequent are both true is more similar to the actual world than the possible world in which its antecedent is true and its consequent is false.

The FIFTH CHAPTER compares Lewis's theory with those of Spohn and Gärdenfors. The latter reduces counterfactuals to a linguistic phenomenon signalling the epistemic process of contraction, dealt with in the first chapter, thereby rejecting the doctrine of possible worlds altogether. Spohn does not address counterfactuals directly (by his own account they are overloaded with linguistic subtleties he does not wish to deal with; he has only a sketch of a theory of conditionals), but he shows that his theory is superior to Lewis's because it accounts, with great precision, for problematic cases of causation that elude the counterfactual formulation: cases in which several causal factors are at play, reinforcing one another or "competing" for the same effect (known in the literature as preemption, trumping, etc.). Spohn can account for them precisely because he has numerical ranks at his disposal, which allow an analysis in which each causal factor is assigned a precise, quantitatively expressed role, whereas the counterfactual doctrine is unable to draw such distinctions (a counterfactual is simply true or false, without gradations).

The SIXTH CHAPTER focuses on Hempel's and Salmon's models of scientific explanation and on the notion of causation developed by the latter, highlighting above all the (problematic) role of laws of nature and of counterfactual statements (in this connection the well-known criticisms by Goodman and Chisholm are also considered). Gärdenfors' theory arose precisely from reflection on these models, and both the Swedish philosopher's doctrine and Spohn's can be seen as attempts to account for scientific explanation in confrontation with these older models.

The SEVENTH CHAPTER focuses on the analysis that epistemic logic provides of laws of nature, which in the previous chapter naturally emerged as a fundamental element of scientific explanation. According to Spohn, laws are first of all affirmative general propositions that are believed in a special way. First, they are believed persistently: they are never called into doubt (so much so that, when one runs into a counter-instance, one looks for a violation of normality that justifies it). Second, they guide and ground the belief in other, specific beliefs, which are conditionalized on them (the old ideas of Wittgenstein and Ramsey, and the conception of a law as an inference ticket, are here taken up with new logical rigour). Third, they are inductively derived generalizations: objectivizations of inductive schemes. This chapter also dwells on the theory of law offered by Gärdenfors (analogous but embryonic) and on Spohn's analysis of the notion of a ceteris paribus clause.

The EIGHTH CHAPTER completes the analysis begun in the sixth, finally considering the epistemic model of scientific explanation. It starts from Gärdenfors' model, which is shown to be undermined by some errors or at any rate insufficiently clearly characterized (above all because, oddly, it makes no appeal to the concept of law). Spohn's model follows: according to Spohn, scientific explanations are characterized by the fact that they provide (or aim to provide) stable reasons, that is, they trace given phenomena back to their causes, and those causes are believed persistently. With a very detailed and surprisingly acute logical argument Spohn shows that such reasons must, in the long run, be encountered. His is therefore not only a theory of scientific explanation, working out an epistemic model of what happens when a phenomenon is explained, but also a theory of the development of science in general, one that encourages the search for causes as necessarily crowned with success.

The CONCLUDING REMARKS take stock of the theories presented and of their comparison. The superiority of Spohn's theory is acknowledged, and it is also shown to take up in full the constructive legacy of Hume, to whom his rivals constantly, but only piecemeal, appeal. The elements of Hempel's and Salmon's theories that anticipated the epistemic approach are then analysed. Spohn's theory is not, however, free of some still problematic points. First of all, the role of truth: at first Spohn seems to give up dealing with truth, as his predecessor explicitly does, only to invoke it when the serious problem of the objectivization of ranking functions arises (the problem arises because ranking functions are initially said to be subjective rules of judgement and are then partly identified with laws of nature). There is then the doctrine of degrees of belief, which Spohn says present themselves "together with propositions" and which constitutes a needless departure from psychological realism (a customary criticism of the theory): it would suffice to observe that degrees of belief are obtained either by automatic conditionalization on the type of source a proposition comes from, or by imaginary comparison with other sources (greater or lesser belief is in fact a relational concept: one believes more or less "on the basis of…" or "with respect to…"). The treatment of laws of nature is also problematic: Spohn maintains that they are ranking functions, whereas in my view they contribute to rules of judgement that prescribe using the laws themselves to evaluate propositions or expectations. A law of nature is, so to speak, a cog in an assessment of certainty, but it does not coincide entirely with a rule of judgement. Moreover, the three criteria Spohn singles out to distinguish laws are not satisfied by all and only laws: inductive generalization can also give rise to prejudices, and not all laws have individually been observed in instances repeated often enough to justify them inductively. Finally, a real episode in the history of science such as F. Wöhler's synthesis of urea (1828: by obtaining carbamide, an organic compound, from two inorganic substances, he showed that the supposed law of nature according to which "organic substances cannot be obtained from inorganic substances" was not true) indicates that laws of nature are not always believed persistently, so that, in order to understand the moment of discovery, it is still necessary to fall back on a theory of the Popperian kind, to which Spohn instead presents his own as diametrically opposed.


Abstract:

This thesis evaluated in vivo and in vitro enamel permeability under different physiological and clinical conditions by means of SEM inspection of replicas of the enamel surface, obtained from polyvinyl siloxane impressions subsequently cast in polyether impression material. This non-invasive, risk-free technique allows the evaluation of fluid outflow from the enamel surface and can detect small quantities of fluid, visualized as droplets. Fluid outflow on the enamel surface represents enamel permeability. This property is of paramount importance in enamel physiology and pathology, although its actual role in adhesion, caries pathogenesis and caries prevention is still not fully understood. The aim of the studies presented was to evaluate enamel permeability changes under different conditions and to correlate the findings with current knowledge of enamel physiology, caries pathogenesis, and fluoride and etching treatments. To corroborate the data, the replica technique was supported by other specific techniques such as Raman and IR spectroscopy and EDX analysis.

The first study visualized fluid movement through dental enamel in vivo, confirmed that enamel is a permeable substrate, and demonstrated that age and enamel permeability are closely related. Samples from subjects of different ages showed a decreasing number and size of droplets with increasing age: freshly erupted permanent teeth showed many droplets covering the entire enamel surface, and droplets in permanent teeth were prominent along the enamel perikymata. These results, obtained through SEM inspection of replicas, allowed novel observations on enamel physiology. An analogous evaluation was carried out on primary enamel. This second study showed that primary enamel has a substantial permeability, with droplets covering the entire enamel surface without any specific localization, in accordance with its histological features, and without age-related changes or signs of post-eruptive maturation. These results are in line with clinical data showing a higher caries susceptibility of primary enamel and suggest a strong relationship between caries susceptibility and enamel permeability.

Topical fluoride application represents the gold standard for caries prevention, although the mechanism of the cariostatic effect of fluoride still needs to be clarified. The effects of topical fluoride application on enamel permeability were evaluated; in particular, two different treatments (NaF and APF), with different pH, were examined. The major product of topical fluoride application was the deposition of CaF2-like globules. Inspection of replicas before and after both treatments, at different time intervals and after specific additional clinical interventions, showed that the globules formed in vivo could be removed by professional toothbrushing, sonically, and chemically by KOH. With respect to enamel permeability, the results showed that fluoride treatments temporarily reduced enamel water permeability even after the CaF2-like globules had been removed. The in vivo persistence of decreased enamel permeability after removal of the CaF2 globules was demonstrated for 1 h for NaF-treated teeth and for at least 7 days for APF-treated teeth. Important clinical considerations follow from these results. In fact, the caries-preventing action of fluoride application may be due, in part, to its ability to decrease enamel water permeability, and CaF2-like globules seem to be indirectly involved in enamel protection over time by maintaining low permeability. Further results, obtained by metallographic microscopy and SEM/EDX analyses of fluoride-releasing and non-releasing orthodontic resins, demonstrated the relevance of topical fluoride application in reducing demineralization marks and modifying the chemical composition of the enamel in the treated area. The data obtained in both experiments confirmed the efficacy of fluoride in caries prevention and contribute to clarifying its mechanism of action.

Adhesive dentistry is the gold standard for caries treatment and tooth rehabilitation and is founded on important chemical and physical principles involving both enamel and dentine substrates. In particular, acid etching of dental enamel is usually employed in bonding procedures to increase microscopic roughness. Different acids have been tested in the literature, and several etching procedures have been proposed. The acid-induced structural transformations in enamel after different etching treatments were evaluated by means of Raman and IR spectroscopy, and these findings were correlated with enamel permeability. Conventional etching with 37% phosphoric acid gel (H3PO4) for 30 s and etching with 15% HCl for 120 s were investigated. Raman and IR spectroscopy showed that treatment with both hydrochloric and phosphoric acid induced a decrease in the carbonate content of the enamel apatite. At the same time, both acids induced the formation of HPO42- ions. After H3PO4 treatment the bands due to the organic component of enamel decreased in intensity, whereas they increased after HCl treatment. Replicas of H3PO4-treated enamel showed strongly reduced permeability, while replicas of 15% HCl-treated samples showed preserved permeability. A decrease in the enamel organic component, as occurs after H3PO4 treatment, entails a decrease in enamel permeability, while an increase in organic matter (as achieved by HCl treatment) maintains enamel permeability. These results suggest a correlation between the amount of organic matter, enamel permeability and caries.

The results of the different studies carried out in this thesis contribute to clarifying and improving knowledge of enamel properties, with important repercussions for theoretical and clinical aspects of dentistry.


Abstract:

Nowadays, it is clear that the target of creating a sustainable future for the next generations requires re-thinking the industrial application of chemistry. It is also evident that more sustainable chemical processes may be economically convenient in comparison with conventional ones, because fewer by-products mean lower costs for raw materials, separation and disposal; they also imply higher productivity and, as a consequence, smaller reactors. In addition, an indirect gain could derive from the better public image of a company marketing sustainable products or processes. In this context, oxidation reactions play a major role, being the tool for the production of huge quantities of chemical intermediates and specialties. The environmental impact of these productions could potentially have been much worse than it is, had continuous efforts not been spent on improving the technologies employed. Substantial technological innovations have driven the development of new catalytic systems and the improvement of reaction and process technologies, helping to move the chemical industry towards a more sustainable and ecological approach. The roadmap for the application of these concepts includes new synthetic strategies, alternative reactants, catalyst heterogenisation, and innovative reactor configurations and process design. In order to turn all these ideas into real projects, the development of more efficient reactions is a primary target. Yield, selectivity and space-time yield are the right metrics for evaluating reaction efficiency. In the case of catalytic selective oxidation, the control of selectivity has always been the principal issue, because the formation of total oxidation products (carbon oxides) is thermodynamically more favoured than the formation of the desired, partially oxidized compound. As a matter of fact, total or near-total conversion is achieved in only a few oxidation reactions, and selectivity is usually limited by the formation of by-products or co-products, which often implies unfavourable process economics; moreover, the cost of the oxidant sometimes further penalizes the process. During my PhD work, I investigated four reactions that are emblematic of the new approaches used in the chemical industry. In Part A of my thesis, a new process aimed at a more sustainable production of menadione (vitamin K3) is described. The "greener" approach includes the use of hydrogen peroxide in place of chromate (moving from a stoichiometric to a catalytic oxidation), which also avoids the production of hazardous waste. Moreover, I studied the possibility of using a heterogeneous catalytic system able to activate hydrogen peroxide efficiently. The overall process would be carried out in two steps: the first is the methylation of 1-naphthol with methanol to yield 2-methyl-1-naphthol; the second is the oxidation of the latter compound to menadione. The catalyst for this latter step, the reaction I investigated, is Nb2O5-SiO2 prepared by the sol-gel technique. The catalytic tests were first carried out under conditions simulating the in-situ generation of hydrogen peroxide, i.e. using a low concentration of the oxidant; experiments using a higher hydrogen peroxide concentration were then carried out.
The study of the reaction mechanism was fundamental to obtaining indications about the best operating conditions and to improving the selectivity to menadione. In Part B, I explored the direct oxidation of benzene to phenol with hydrogen peroxide. The industrial process for phenol is the oxidation of cumene with oxygen, which also co-produces acetone. This can be considered a case in which economics drives the sustainability issue: a new process yielding phenol directly, besides avoiding the co-production of acetone (a burden for phenol, because the market requirements for the two products are quite different), might be economically convenient with respect to the conventional process, provided a high selectivity to phenol is obtained. Titanium silicalite-1 (TS-1) is the catalyst chosen for this reaction. By comparing the reactivity results obtained with TS-1 samples having different chemical-physical properties, and by analysing in detail the effect of the most important reaction parameters, we could formulate some hypotheses concerning the reaction network and mechanism. Part C of my thesis deals with the hydroxylation of phenol to hydroquinone and catechol. This reaction is already applied industrially but, for economic reasons, an improvement in the selectivity to the para di-hydroxylated compound and a decrease in the selectivity to the ortho isomer would be desirable. Also in this case, the catalyst used was TS-1. The aim of my research was to find a method to control the selectivity ratio between the two isomers and, ultimately, to make the industrial process more flexible, so that its performance can be adapted to fluctuations in market requirements. The reaction was carried out both in a batch stirred reactor and in a re-circulating fixed-bed reactor. In the first system, the effect of various reaction parameters on catalytic behaviour was investigated: type of solvent or co-solvent, and particle size. With the second reactor type, I investigated the possibility of using a continuous system, with the catalyst shaped into extrudates (instead of powder) in order to avoid the catalyst filtration step. Finally, Part D deals with the study of a new process for the valorisation of glycerol by means of its transformation into valuable chemicals. This molecule is nowadays produced in large amounts as a co-product of biodiesel synthesis; it is therefore considered a raw material from renewable resources (a bio-platform molecule). Initially, we tested the liquid-phase oxidation of glycerol with hydrogen peroxide and TS-1, but the results achieved were not satisfactory. We then investigated the gas-phase transformation of glycerol into acrylic acid, with the intermediate formation of acrolein: the latter can be obtained by dehydration of glycerol and then oxidized to acrylic acid (the overall route is summarized in the scheme below). Since the oxidation step from acrolein to acrylic acid is already optimized at the industrial level, we decided to investigate the first step of the process in depth. I studied the reactivity of heterogeneous acid catalysts based on sulphated zirconia. Tests were carried out under both aerobic and anaerobic conditions, in order to investigate the effect of oxygen on the catalyst deactivation rate (a major problem commonly encountered in glycerol dehydration).
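For concreteness, the two-step route from glycerol to acrylic acid outlined above corresponds to the following balanced equations (my own rendering of standard chemistry, not taken from the thesis):

```latex
% Step 1: acid-catalysed double dehydration of glycerol to acrolein
\mathrm{C_3H_8O_3 \;\longrightarrow\; CH_2\!=\!CH\!-\!CHO \;+\; 2\,H_2O}

% Step 2: selective oxidation of acrolein to acrylic acid
\mathrm{CH_2\!=\!CH\!-\!CHO \;+\; \tfrac{1}{2}\,O_2 \;\longrightarrow\; CH_2\!=\!CH\!-\!COOH}
```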
Finally, I studied the reactivity of bifunctional systems made of Keggin-type polyoxometalates, either alone or supported on sulphated zirconia, thereby combining the acid functionality (necessary for the dehydration step) with the redox functionality (necessary for the oxidation step). In conclusion, during my PhD work I investigated reactions that apply "green chemistry" rules and strategies; in particular, I studied new, greener approaches for the synthesis of chemicals (Part A and Part B), the optimisation of reaction parameters to make an oxidation process more flexible (Part C), and the use of a bio-platform molecule for the synthesis of a chemical intermediate (Part D).
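For reference, the efficiency metrics invoked at the beginning of this abstract (X conversion, S selectivity, Y yield, STY space-time yield) are conventionally defined as follows; this is a standard textbook formulation, written here for a 1:1 stoichiometry, with symbols of my own choosing:

```latex
X = \frac{n_{\text{reactant, in}} - n_{\text{reactant, out}}}{n_{\text{reactant, in}}},
\qquad
S = \frac{n_{\text{desired product formed}}}{n_{\text{reactant converted}}},
\qquad
Y = X \cdot S,
\qquad
\mathrm{STY} = \frac{m_{\text{desired product}}}{V_{\text{reactor}} \cdot t}
```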


Abstract:

Inflammation is thought to contribute to the pathogenesis of neurodegenerative diseases. Among the resident cell populations of the brain, astroglia have been suggested to participate actively in the induction and regulation of neuroinflammation by controlling the secretion of local mediators. However, the initial cellular mechanisms by which astrocytes react to pro-inflammatory molecules are still unclear. Our study identified mitochondria as highly sensitive organelles that rapidly respond to inflammatory stimuli. Time-lapse video microscopy revealed that mitochondrial morphology, dynamics and motility are drastically altered upon inflammation, resulting in perinuclear clustering of mitochondria. These mitochondrial rearrangements are accompanied by an increased formation of reactive oxygen species and a recruitment of autophagic vacuoles. Twenty-four to 48 hours after the acute inflammatory stimulus, however, the mitochondrial network is re-established. Strikingly, the recovery of a tubular mitochondrial network is abolished in astrocytes with a defective autophagic response, indicating that activation of autophagy is required to restore mitochondrial dynamics. By employing co-cultivation assays, we observed that primary cortical neurons undergo degeneration in the presence of inflamed astrocytes. However, this effect was not observed when the primary neurons were grown in conditioned medium derived from inflamed astrocytes, suggesting that direct contact between astrocytes and neurons mediates neuronal dysfunction upon inflammation. Our results suggest that astrocytes react to inflammatory stimuli by transiently rearranging their mitochondria, a process that involves the autophagic machinery.


Abstract:

In this PhD thesis three projects were addressed, focusing on the melanopsin retinal ganglion cell (mRGC) system and its relevance for circadian rhythms and sleep in neurodegeneration. The first project aimed to complete the characterization of the mRGC system in hereditary optic neuropathies (LHON and DOA). We confirmed that mRGCs are relatively spared also in post-mortem retinal specimens of a DOA case, and pupillometric evaluation of LHON patients showed preservation of the pupillary light reflex, with attenuated responses compared to controls. Cell studies failed to indicate a protective role exerted by melanopsin itself. The second project aimed to characterize the possible occurrence of optic neuropathy and rest-activity circadian rhythm dysfunction in Alzheimer disease (AD) and Parkinson disease (PD) and, at the histological level, the possible involvement of mRGCs in AD. OCT studies demonstrated a subclinical optic neuropathy in both AD and PD patients, with a different pattern involving the superior and nasal quadrants in AD and the temporal quadrant in PD. Actigraphic studies demonstrated a tendency towards increased intradaily variability (IV) and reduced relative amplitude (RA) of the rest-activity circadian rhythm in AD, and a significantly increased IV and reduced RA in PD. Immunohistochemical analysis of post-mortem retinal specimens and optic nerve cross-sections of neuropathologically confirmed AD cases demonstrated a significant loss of mRGCs and a nearly significant loss of axons in AD compared to controls. The mRGCs were affected in AD independently of age and of the magnitude of axonal loss. Overall, these results suggest a role of the mRGC system in the pathogenesis of circadian dysfunction in AD. The third project aimed to evaluate the possible association between a single nucleotide polymorphism of the OPN4 gene and chronotype or SAD: no significant association with chronotype was found, but a non-significant increase of the TT genotype was observed in SAD.
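For context, IV and RA are the standard nonparametric actigraphy indices; they are not defined in the abstract, so the usual formulation is recalled here (hourly activity counts x_i over N hours; M10 and L5 are the mean activity of the 10 most and the 5 least active hours, respectively):

```latex
\mathrm{IV} \;=\; \frac{N \sum_{i=2}^{N} (x_i - x_{i-1})^2}{(N-1) \sum_{i=1}^{N} (x_i - \bar{x})^2},
\qquad
\mathrm{RA} \;=\; \frac{M10 - L5}{M10 + L5}
```

A higher IV indicates a more fragmented rhythm, and a lower RA a less robust rest-activity rhythm, which is the direction of change reported above for AD and PD.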


Abstract:

One important metaphor, drawn from biological theories and used to investigate organizational and business strategy issues, is that of heredity; an area requiring further investigation is the extent to which the characteristics of the blueprints inherited from the parent help explain the subsequent development of spawned ventures. In order to shed light on the tension between inherited patterns and the new trajectories that may characterize spawned ventures' development, we propose a model aimed at investigating which blueprint elements might affect business model design choices and to what extent their persistence (or abandonment) determines subsequent business model innovation. Under the assumption that academic and corporate institutions transmit different genes to their spin-offs, we expect heterogeneity in the elements that affect business model design choices and their subsequent evolution. For this reason we carry out a twofold analysis in the biotech (meta)industry: under a multiple-case research design, the business model, and especially the fundamental design elements and themes that scholars have identified to decompose the construct, has been thoroughly analysed. Our purpose is to isolate the dimensions of the business model that may have been the object of legacy and those along which a process of experimentation and learning is more likely to happen, bearing in mind that differences between academic and corporate spin-offs might not be as evident as expected, especially considering that business model innovation may occur.


Abstract:

The topic of this work is nonparametric permutation-based methods aimed at finding a ranking (stochastic ordering) of a given set of groups (populations), gathering together information from multiple variables under more than one experimental design. The problem of ranking populations arises in several fields of science from the need to compare G > 2 given groups or treatments when the main goal is to find an ordering while taking several aspects into account. As can be imagined, this problem is not only of theoretical interest: it also has recognised relevance in several fields, such as industrial experiments and the behavioural sciences, as reflected by the vast literature on the topic, although the problem is sometimes associated with different keywords such as "stochastic ordering", "ranking", "construction of composite indices", or even "ranking probabilities" outside the strictly statistical literature. The properties of the proposed method are empirically evaluated by means of an extensive simulation study, in which several aspects of interest are allowed to vary within a reasonable practical range: sample size, number of variables, number of groups, and distribution of the noise/error. The flexibility of the approach lies mainly in the several available choices for the test statistic and in the different types of experimental design that can be analysed. This renders the method able to be tailored to the specific problem and to the nature of the data at hand. To perform the analyses, an R package called SOUP (Stochastic Ordering Using Permutations) was written; it is available on CRAN.
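As a rough, simplified illustration of the kind of procedure at issue (a univariate Python sketch of one common permutation-ranking scheme, not the algorithm implemented in SOUP, whose exact procedure is not described here): groups are compared pairwise with one-sided permutation tests on a chosen statistic, and a ranking is derived from the number of rivals each group significantly dominates.

```python
import numpy as np

def perm_pvalue(x, y, n_perm=2000, seed=None):
    """One-sided permutation p-value for H1: mean(x) > mean(y)."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    observed = x.mean() - y.mean()
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                       # random relabelling of the pooled data
        x_star, y_star = pooled[:len(x)], pooled[len(x):]
        if x_star.mean() - y_star.mean() >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)             # add-one correction

def permutation_ranking(groups, alpha=0.05, n_perm=2000, seed=0):
    """Rank groups by how many rivals they significantly dominate
    (pairwise one-sided permutation tests on the mean)."""
    names = list(groups)
    wins = {name: 0 for name in names}
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if perm_pvalue(groups[a], groups[b], n_perm, seed) < alpha:
                wins[a] += 1
            elif perm_pvalue(groups[b], groups[a], n_perm, seed) < alpha:
                wins[b] += 1
    # Competition ranking: rank 1 for the group(s) with most wins.
    return {name: 1 + sum(wins[o] > wins[name] for o in names) for name in names}

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    data = {"A": rng.normal(1.0, 1, 30),
            "B": rng.normal(0.5, 1, 30),
            "C": rng.normal(0.0, 1, 30)}
    print(permutation_ranking(data))
```

The method described in the abstract additionally combines several variables and handles different experimental designs; this sketch only conveys the permutation-based pairwise-comparison idea behind the ranking.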


Abstract:

Purpose. Although work-related stress is one of the most studied topics in organizational psychology, many aspects, such as the use of different measures (e.g. subjective and objective, qualitative and quantitative), are still under debate. Accordingly, in order to enhance knowledge of which factors and processes contribute to creating healthy workplaces, this thesis comprises four studies aiming to understand: a) the role of relevant antecedents (e.g. leadership, job demands, work-family conflict, social support, etc.) and outcomes (e.g. workplace phobia, absenteeism, etc.) of work-related stress; and b) how to manage psychosocial risk factors in the workplace. The studies. The first study focused on how disagreement between supervisors and their employees on leadership style (transformational and transactional) could affect workers' well-being and work team variables. The second and third studies used both subjective and objective data in order to increase the reliability of the results obtained. In particular, the second study focused on job demands and their relationship with objective sickness absence. Findings showed that, although there is no direct relationship between these two variables, job demands affect work-family conflict, which in turn affects exhaustion, which leads to absenteeism. The third study analysed the role of a concept never studied before in organizational settings, workplace phobia, as a health outcome in the JD-R model, also demonstrating its relationship with absenteeism. The last study highlighted the added value of using a mixed-methods research approach in order to detect and analyse context-specific job demands that could affect workers' health. Conclusion. The findings of this thesis respond both to open questions in the scientific literature and to the societal demand for managing psychosocial risk factors in the workplace in order to enhance workers' well-being.


Abstract:

In this thesis two approaches were applied to achieve a twofold general objective. The first chapter was dedicated to the study of the distribution of the expression of several bitter and fat taste receptor genes in several gastrointestinal segments. A set of 7 genes for bitter taste and 3 genes for fat taste was amplified by real-time PCR from mRNA extracted from 5 gastrointestinal segments of weaned pigs. The presence of gene expression for several chemosensing receptors for bitter and fat taste in different compartments of the stomach confirms that this organ should be considered a player in the early detection of bolus composition. In the second chapter we investigated, in young pigs, the distribution of the butyrate-sensing olfactory receptor OR51E1 along the GIT, its relation with some endocrine markers, its variation with age, and its changes after interventions affecting the gut environment and intestinal microbiota, in piglets and in different tissues. Our results indicate that OR51E1 is closely related to normal GIT enteroendocrine activity. In the third chapter we investigated the differential gene expression between oxyntic and pyloric mucosa in seven starter pigs. The data obtained indicate that there is significant differential gene expression between the oxyntic and the pyloric mucosa of the young pig, and further functional studies are needed to confirm its physiological importance. In the last chapter, thymol, which has been proposed as an oral alternative to antibiotics in the feed of pigs and broilers, was introduced directly into the stomach of 8 weaned pigs, and samples of gastric oxyntic and pyloric mucosa were taken. The analysis of whole-transcriptome expression shows that the stimulation of gastric proliferative activity and the control of digestive activity by thymol can positively influence gastric maturation and function in weaned pigs.