966 results for Two-point boundary value problems


Relevance: 40.00%

Abstract:

Triglycerides are reacted in the liquid phase with methanol and a homogeneous basic catalyst. The reaction yields two spatially separated phases: an upper non-polar phase consisting principally of non-polar methyl esters, and a lower phase consisting principally of glycerol and residual methyl esters. The glycerol phase is passed through a strong cation exchanger to remove the catalyst cations, resulting in a neutral product that is flashed to remove methanol and then reacted with isobutylene in the presence of a strong acid catalyst to produce glycerol ethers. The glycerol ethers are then added back to the upper methyl ester phase to provide an improved biodiesel fuel.
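For context, the liquid-phase reaction described above is the standard base-catalyzed transesterification (textbook stoichiometry, not spelled out in the abstract):

    \mathrm{triglyceride} + 3\,\mathrm{CH_3OH} \;\xrightarrow{\text{base}}\; 3\,\mathrm{RCO_2CH_3}\ (\text{methyl esters}) + \mathrm{glycerol}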

Relevance: 40.00%

Abstract:

The problem of rats in our Hawaiian sugar cane fields has been with us for a long time. Early records tell of heavy damage at various times on all the islands where sugar cane is grown. Many methods were tried to control these rats. Trapping was once used as a control measure, a bounty was used for a time, and gangs of dogs were trained to catch the rats as the cane was harvested. Many kinds of baits and poisons were used. All of these methods were of some value as long as labor was cheap. Our present-day problem started when labor costs went up and the sugar industry shifted to long cropping. Until World War II cane was an annual crop. After the war it was shifted to a two-year crop, three years in some places. Depending on variety, location, and soil we raise 90 to 130 tons of sugar cane per acre, which produces 7 to 15 tons of sugar per acre for a two-year crop. This sugar brings about $135 per ton.

This tonnage of cane is a thick tangle of vegetation. The cane grows erect for almost a year; as it continues to grow it bends over at the base. This allows the stalk to rest on the ground or on other stalks of cane as it continues to grow. These stalks form a tangled mat of stalks and dead leaves that may be two feet thick at the time of harvest. At the same time the leafy growing portion of the stalk will be sticking up out of the mat of cane ten feet in the air. Some of these individual stalks may be 30 feet long and still growing at the time of harvest. All this makes it very hard to get through a cane field; it is one long, prolonged stumble over and through the cane.

It is in this mat of cane that our three species of rats live. Two species are familiar to most people in the pest control field: Rattus norvegicus and Rattus rattus. In the latter species we include both the black rat and the Alexandrine rat; their habits seem to be the same in Hawaii. Our third rat is the Polynesian rat, Rattus exulans, locally called the Hawaiian rat. This is a small rat: the average length from head to tip of tail is nine inches and the average body weight is 65 grams. It has dark brownish fur like the Alexandrine rat, and a grey belly. It is found in Indonesia, on most of the islands of Oceania, and in New Zealand. All three rats live in our cane fields and the brushy and forested portions of our islands. The Norway and Alexandrine rats are found in and around the villages and farms; the Polynesian rat is found only in the fields and waste areas.

The actual amount of cane eaten by rats is small, but the destruction they cause is large. The rats gnaw through the rind of the cane stalk and eat the soft, juicy, sweet tissues inside. They will hollow out one to several nodes per stalk attacked. The effect on the cane stalk is like ringing a tree. After this attack the stalk above the chewed portion usually dies, and sometimes the lower portion too. If the rat does not eat all the way through the stalk, the cane could go on living and producing sugar at a reduced rate, but generally an injured stalk does not last long. Disease and souring organisms get into the injury and kill the stalk. And if this isn't enough, some insects are attracted to the injured stalk and will sometimes bore in and kill it. An injured stalk of cane doesn't have much of a chance. A rat may gnaw out only six inches of a 30-foot stalk and the whole stalk will die. If rats destroyed only what they ate we could ignore them, but they cause the death of too much cane. This dead, dying, and souring cane causes several direct and indirect losses.

First we lose the sugar that the cane would have produced. We harvest all of our cane mechanically, so we haul the dead and souring cane to the mill, where we have to grind it with our good cane, and the bad cane reduces the purity of the sugar juices we squeeze from the cane. Rats reduce our income and run up our overhead.

Relevance: 40.00%

Abstract:

Our chairman has wisely asked that we not spend all of our time here telling each other about our bird problems. In the Southeast, our difficulties with blackbirds are based upon the same bird habits that cause trouble elsewhere: they flock, they roost, and they eat, generally taking advantage of the readily available handouts that today's agricultural practices provide. Those of us on the receiving end of these depredations of course think that damage in our own particular area must be by far the worst anywhere. Because of the location of our meeting place today, perhaps it is worthwhile to point out that a report prepared by our Bureau's Washington office this year outlined the problem of blackbird damage to corn in the Middle Atlantic States, the Great Lakes Region, and Florida, and then followed with this statement: "An equally serious problem occurs in rice and grain sorghum fields of Arkansas, Mississippi, Texas and Louisiana." The report also mentions that the largest winter concentrations of blackbirds are found in the lower Mississippi Valley. Our 1963-64 blackbird-starling survey showed 43 principal roosts totaling approximately 100 million of these birds in Virginia, the Carolinas, Georgia, Alabama, Tennessee, and Kentucky. We have our own birds during the summer plus the "tourist" birds from up here and elsewhere during the winter, and all of these birds must eat, so suffice it to say that we, too, have some bird problems in the Southeast.

I'm sure you're more interested in what we're doing about them. To keep this in perspective also, please bear in mind that against the magnitude of these problems, our blackbird control research staff at Gainesville consists of three biologists, one biochemist, and one technician. And unfortunately, none of us happens to be a miracle worker. I think, though, that we have made great progress toward solving the bird problems in the Southeast for the man-hours that have been expended in this research. My only suggestion to those who are impatient about not having more answers is that they examine the budget that has been set up for this work. Only then could we intelligently discuss what might be expected as a reasonable rate of research progress. When I think about what we have accomplished in a short span of time, with very small expenditure, I can assure you that I am very proud of our small research crew at Gainesville--and I say this quite sincerely.

At the Gainesville station, we work under two general research approaches to the bird damage problem; these projects have been assigned to us. The first is research on management of birds, particularly blackbirds and starlings destructive to crops or in feedlots; the second is the development and adaptation of chemical compounds found to be toxic to birds but relatively safe for mammals. Both approaches require laboratory and field work that is further subdivided into several specific research projects. Without describing the details of these now, I want to mention some of our recent results. From the results, I'm sure you will gather the general objectives and some of the procedures used.

Relevance: 40.00%

Abstract:

It has been shown that the vertical structure of the Brazil Current (BC)-Intermediate Western Boundary Current (IWBC) System is dominated by the first baroclinic mode at 22°S-23°S. In this work, we employed the Miami Isopycnic Coordinate Ocean Model to investigate whether the rich mesoscale activity of this current system, between 20°S and 28°S, is reproduced by a two-layer approximation of its vertical structure. The model results showed cyclonic and anticyclonic meanders propagating southwestward along the current axis, resembling the dynamical pattern of Rossby waves superposed on a mean flow. Analysis of the upper-layer zonal velocity component, using a space-time diagram, revealed a dominant wavelength of about 450 km and a phase velocity of about 0.20 m/s southwestward. The results also showed that the eddy-like structures slowly grew in amplitude as they moved downstream. Despite the simplified design of the numerical experiments conducted here, these results compared favorably with observations and seem to indicate that weakly unstable long baroclinic waves are responsible for most of the variability observed in the BC-IWBC system.
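As a quick consistency check (my arithmetic, not the paper's), the period implied by the reported wavelength and phase speed follows from T = lambda/c:

    # Implied period of the dominant meander from the reported scales.
    wavelength_m = 450e3      # dominant wavelength: ~450 km
    phase_speed = 0.20        # phase velocity: ~0.20 m/s southwestward

    period_s = wavelength_m / phase_speed     # T = lambda / c
    period_days = period_s / 86400.0
    print(f"implied period: {period_days:.0f} days")  # ~26 days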

Relevance: 40.00%

Abstract:

We investigate the interface dynamics of the two-dimensional stochastic Ising model in an external field under helicoidal boundary conditions. At sufficiently low temperatures and fields, the dynamics of the interface is described by an exactly solvable high-spin asymmetric quantum Hamiltonian that is the infinitesimal generator of the zero-range process. Generically, the critical dynamics of the interface fluctuations is in the Kardar-Parisi-Zhang (KPZ) universality class. We remark that a whole family of restricted solid-on-solid (RSOS) interface models similar to the Ising interface model investigated here can be described by exactly solvable restricted high-spin quantum XXZ-type Hamiltonians.
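For reference, the KPZ universality class invoked above is defined by the Kardar-Parisi-Zhang equation for an interface height h(x,t) (standard one-dimensional form, not quoted in the abstract):

    \partial_t h = \nu\,\partial_x^2 h + \frac{\lambda}{2}\,(\partial_x h)^2 + \eta(x,t)

where \eta is Gaussian white noise; the nonlinear \lambda-term is what separates KPZ scaling from the linear Edwards-Wilkinson class.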

Relevance: 40.00%

Abstract:

The transport properties of a two-dimensional system in HgTe-based quantum wells simultaneously containing low-density electrons and holes are examined. The Hall resistance, as a function of perpendicular magnetic field, reveals an unconventional behavior, different from the classical N-shaped dependence typical of bipolar systems with electron-hole asymmetry. The quantum features of the magnetotransport are explained by means of a numerical calculation of the Landau level spectrum based on the Kane Hamiltonian. The origin of the quantum Hall plateau sigma_xy = 0 near the charge neutrality point is attributed to special features of Landau quantization in our system.
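For context, the "classical N-shaped dependence" mentioned above is the textbook two-carrier Drude result (not taken from the paper): with electron and hole densities n and p and mobilities \mu_n and \mu_p, the classical Hall resistivity is

    \rho_{xy}(B) = \frac{B}{e}\, \frac{(p\mu_p^2 - n\mu_n^2) + (p - n)\,\mu_p^2 \mu_n^2 B^2}{(p\mu_p + n\mu_n)^2 + (p - n)^2\,\mu_p^2 \mu_n^2 B^2}

which can change sign with B when n and p are comparable, producing the N shape; the abstract reports deviations from this classical behavior.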

Relevance: 40.00%

Abstract:

Nonlocal resistance is studied in a two-dimensional system with the simultaneous presence of electrons and holes in a 20 nm HgTe quantum well. A large nonlocal electric response is found near the charge neutrality point in the presence of a perpendicular magnetic field. We attribute the observed nonlocality to edge-state transport via counterpropagating chiral modes, similar to the quantum spin Hall effect at zero magnetic field and to graphene near Landau filling factor nu = 0.

Relevance: 40.00%

Abstract:

Objective: to determine the ability of the reduced form of a screening instrument, the Patient Health Questionnaire-2 (PHQ-2), to detect depressive disorders in patients admitted to a general hospital. Method: a sample of 227 patients admitted to the clinical wards of a Brazilian general university hospital was assessed with Module A of the Structured Clinical Interview for DSM-IV (SCID-IV) and filled out the PHQ-9 and PHQ-2. Results: the PHQ-2 demonstrated an area under the ROC curve of 0.89 (p < 0.0001), with a cutoff point of three or more best balancing sensitivity (0.86) and specificity (0.75). The agreement between the PHQ-2 and Module A of the SCID-IV was 78.4%, with a Kappa value of 0.51. Regarding reliability, the Cronbach alpha obtained was 0.64 and the intraclass correlation coefficient was 0.52. Conclusion: the PHQ-2 proved to be an instrument with good psychometric properties, comparable to those of the PHQ-9 and superior to the latter regarding the rate of false-positive results. In addition, it is a brief, inexpensive instrument that elicits little resistance from patients and requires little time; it can therefore help treatment teams detect depressive disorder and is suitable for incorporation into hospital admission protocols, possibly favoring more immediate interventions. (Int'l J. Psychiatry in Medicine 2012;44:141-148)
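A minimal sketch of how such a cutoff is scored against the gold standard (illustrative data, not the study's; a real analysis would sweep all cutoffs to build the ROC curve):

    # Hypothetical PHQ-2 scores (0-6) and SCID diagnoses (1 = depressed).
    scores    = [0, 1, 3, 5, 2, 4, 6, 1, 3, 0]
    diagnosis = [0, 0, 1, 1, 0, 1, 1, 0, 0, 0]

    cutoff = 3  # "three or more" screens positive, as in the abstract
    tp = sum(s >= cutoff and d == 1 for s, d in zip(scores, diagnosis))
    fn = sum(s <  cutoff and d == 1 for s, d in zip(scores, diagnosis))
    tn = sum(s <  cutoff and d == 0 for s, d in zip(scores, diagnosis))
    fp = sum(s >= cutoff and d == 0 for s, d in zip(scores, diagnosis))

    print(f"sensitivity={tp / (tp + fn):.2f}, specificity={tn / (tn + fp):.2f}")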

Relevance: 40.00%

Abstract:

Effects of roads on wildlife and its habitat have been measured using metrics such as the nearest road distance, road density, and effective mesh size. In this work we introduce two new indices: (1) the Integral Road Effect (IRE), which measures the summed effect of the points of a road at a fixed point in the forest; and (2) the Average Value of the Infinitesimal Road Effect (AVIRE), which measures the average effect of the roads at that point. IRE is formally defined as the line integral of a special function (the infinitesimal road effect) along the curves that model the roads, whereas AVIRE is the quotient of IRE by the length of the roads. Combining tools of the ArcGIS software with a numerical algorithm, we calculated these and other road and habitat cover indices at a sample of points in a human-modified landscape in the Brazilian Atlantic Forest, where data on the abundance of two groups of small mammals (forest specialists and habitat generalists) were collected in the field. We then compared, through the Akaike Information Criterion (AIC), a set of candidate regression models to explain the variation in small mammal abundance, including models with our two new road indices (AVIRE and IRE), models with other road effect indices (nearest road distance, mesh size, and road density), and reference models (containing only habitat indices, or only the intercept without the effect of any variable). Compared to the other road effect indices, AVIRE showed the best performance in explaining the abundance of forest specialist species, whereas the nearest road distance performed best for generalist species. AVIRE and habitat together were included in the best model for both small mammal groups; that is, higher abundance of specialist and generalist small mammals occurred where there was a lower average road effect (less AVIRE) and more habitat. Moreover, AVIRE was not significantly correlated with the habitat cover of specialists and generalists, differing from the other road effect indices except mesh size, which allows the effect of roads to be separated from the effect of habitat on small mammal communities. We suggest that the proposed indices and GIS procedures could also be useful for describing other spatial ecological phenomena, such as edge effects in habitat fragments.
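In symbols, the two verbal definitions above read (with f the infinitesimal road effect, \gamma the curves modeling the roads, and L(\gamma) their total length):

    \mathrm{IRE}(p) = \int_{\gamma} f(p, s)\,ds, \qquad \mathrm{AVIRE}(p) = \frac{\mathrm{IRE}(p)}{L(\gamma)}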

Relevance: 40.00%

Abstract:

Due to rapid and continuous deforestation, recent bird surveys in the Atlantic Forest follow rapid assessment programs to accumulate significant amounts of data during short periods of time. In this study, two survey methods were used to evaluate which technique more rapidly accumulated the most species (> 90% of the estimated empirical value) in lowland Atlantic Forests in the state of São Paulo, southeastern Brazil. Birds were counted during the 2008-2010 breeding seasons using 10-minute point counts and 10-species lists. Overall, point counting detected about as many species as lists (79 vs. 83, respectively), and 88 points (14.7 h) detected 90% of the estimated species richness, whereas 41 lists were insufficient to detect 90% of all species. However, lists accumulated species faster per unit time, probably because of the nature of the point count method, in which species detected while moving between points are not counted. Rapid assessment programs in these forests will therefore detect more species more rapidly using 10-species lists. The two methods shared 63% of all forest species, but this may be due to spatial and temporal mismatch between the samplings of each method.
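A minimal sketch of the species-accumulation comparison involved (hypothetical detection model: a common pool of 90 species, each sampling unit detecting a random subset; real analyses would use rarefaction with confidence intervals):

    import random

    random.seed(1)
    species_pool = list(range(90))  # hypothetical total richness

    def accumulation(n_units, detections_per_unit):
        """Cumulative number of distinct species across sampling units."""
        seen = set()
        for _ in range(n_units):
            seen.update(random.sample(species_pool, detections_per_unit))
        return len(seen)

    print(accumulation(88, 6))   # point counts: fewer detections per unit
    print(accumulation(41, 10))  # 10-species lists: exactly ten per unit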

Relevance: 40.00%

Abstract:

In the past decade, the advent of efficient genome sequencing tools and high-throughput experimental biotechnology has led to enormous progress in the life sciences. Among the most important innovations is microarray technology. It allows the expression of thousands of genes to be quantified simultaneously by measuring the hybridization from a tissue of interest to probes on a small glass or plastic slide. The characteristics of these data include a fair amount of random noise, a predictor dimension in the thousands, and a sample size in the dozens. One of the most exciting areas to which microarray technology has been applied is the challenge of deciphering complex diseases such as cancer. In these studies, samples are taken from two or more groups of individuals with heterogeneous phenotypes, pathologies, or clinical outcomes. These samples are hybridized to microarrays in an effort to find a small number of genes that are strongly correlated with the group of individuals. Even though analysis methods are today well developed and close to reaching a standard organization (through the efforts of proposed international projects like the Microarray Gene Expression Data (MGED) Society [1]), it is not infrequent to stumble on a clinician's question for which there is no compelling statistical method to answer it. The contribution of this dissertation to deciphering disease is the development of new approaches aimed at handling open problems posed by clinicians in specific experimental designs.

In Chapter 1, starting from a necessary biological introduction, we review microarray technologies and all the important steps of an experiment, from the production of the array through quality controls to the preprocessing steps used in the data analysis in the rest of the dissertation. In Chapter 2 a critical review of standard analysis methods is provided, stressing most of their open problems. In Chapter 3 a method is introduced to address the issue of unbalanced design in microarray experiments. In microarray experiments, experimental design is a crucial starting point for obtaining reasonable results. In a two-class problem, an equal or similar number of samples should be collected for the two classes. However, in some cases, e.g. rare pathologies, the approach to be taken is less evident. We propose to address this issue by applying a modified version of SAM [2]. MultiSAM consists of a reiterated application of a SAM analysis, comparing the less populated class (LPC) with 1,000 random samplings of the same size from the more populated class (MPC). A list of the differentially expressed genes is generated for each SAM application. After 1,000 reiterations, each single probe is given a "score" ranging from 0 to 1,000 based on its recurrence in the 1,000 lists as differentially expressed. The performance of MultiSAM was compared to the performance of SAM and LIMMA [3] over two data sets simulated via beta and exponential distributions. The results of all three algorithms over low-noise data sets seem acceptable. However, on a real unbalanced two-channel data set regarding Chronic Lymphocytic Leukemia, LIMMA finds no significant probe, SAM finds 23 significantly changed probes but cannot separate the two classes, while MultiSAM finds 122 probes with score > 300 and separates the data into two clusters by hierarchical clustering.
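A schematic of the MultiSAM resampling loop described above (a sketch only: random placeholder data, and a plain per-probe t-test standing in for the SAM statistic):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_probes = 500
    lpc = rng.normal(size=(n_probes, 8))   # less populated class (LPC)
    mpc = rng.normal(size=(n_probes, 60))  # more populated class (MPC)

    scores = np.zeros(n_probes, dtype=int)
    for _ in range(1000):
        # random sampling of the MPC with the same size as the LPC
        cols = rng.choice(mpc.shape[1], size=lpc.shape[1], replace=False)
        # stand-in for the SAM call on the LPC-vs-subsample comparison
        _, p = stats.ttest_ind(lpc, mpc[:, cols], axis=1)
        scores += (p < 0.01)  # probe recurs in this list of DE genes

    print((scores > 300).sum())  # probes retained with score > 300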
We also report extra-assay validation in terms of differentially expressed genes. Although standard algorithms perform well over low-noise simulated data sets, MultiSAM seems to be the only one able to reveal subtle differences in gene expression profiles on real unbalanced data.

In Chapter 4 a method is described to address the evaluation of similarities in a three-class problem by means of the Relevance Vector Machine [4]. Indeed, looking at microarray data in a prognostic and diagnostic clinical framework, differences are not the only thing that matters: in some cases similarities can give useful and sometimes even more important information. The goal, given three classes, could be to establish, with a certain level of confidence, whether the third one is similar to the first or to the second. In this work we show that the Relevance Vector Machine (RVM) [4] could be a possible solution to the limitations of standard supervised classification. RVM offers many advantages compared, for example, with its well-known precursor, the Support Vector Machine (SVM) [3]. Among these advantages, the estimate of the posterior probability of class membership represents a key feature for addressing the similarity issue. This is a highly important, but often overlooked, option of any practical pattern recognition system. We focused on a three-class tumor-grade problem, with 67 samples of grade 1 (G1), 54 samples of grade 3 (G3), and 100 samples of grade 2 (G2). The goal is to find a model able to separate G1 from G3, and then evaluate the third class G2 as a test set to obtain, for each G2 sample, the probability of membership in class G1 or class G3. The analysis showed that breast cancer samples of grade 2 have a molecular profile more similar to breast cancer samples of grade 1. This result had been conjectured in the literature, but no measure of significance had been given before.
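A sketch of the G1-vs-G3 / G2-as-test-set design (hypothetical data; scikit-learn's logistic regression stands in for the RVM here, since both return class-membership probabilities):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n_genes = 200
    g1 = rng.normal(0.0, 1.0, size=(67, n_genes))   # grade 1
    g3 = rng.normal(0.5, 1.0, size=(54, n_genes))   # grade 3
    g2 = rng.normal(0.2, 1.0, size=(100, n_genes))  # grade 2 (test set)

    X = np.vstack([g1, g3])
    y = np.array([0] * len(g1) + [1] * len(g3))     # 0 = G1, 1 = G3

    clf = LogisticRegression(max_iter=1000).fit(X, y)
    p_g3 = clf.predict_proba(g2)[:, 1]              # P(G3) for each G2 sample
    print(f"mean P(G3) over G2: {p_g3.mean():.2f}")  # < 0.5 => G2 closer to G1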

Relevance: 40.00%

Abstract:

This work deals with some classes of linear second-order partial differential operators with non-negative characteristic form and underlying non-Euclidean structures. These structures are determined by families of locally Lipschitz-continuous vector fields in R^N, generating metric spaces of Carnot-Carathéodory type. The Carnot-Carathéodory metric related to a family {X_j}, j = 1,...,m, is the control distance obtained by minimizing the time needed to travel between two points along piecewise trajectories of the vector fields. We are mainly interested in the cases in which a Sobolev-type inequality holds with respect to the X-gradient, and/or the X-control distance is doubling with respect to the Lebesgue measure in R^N. This study is divided into three parts (each corresponding to a chapter), and the subject of each one is a class of operators that includes the class of the subsequent one. In the first chapter, after recalling "X-ellipticity" and related concepts introduced by Kogoj and Lanconelli in [KL00], we show a Maximum Principle for linear second-order differential operators for which we only assume a Sobolev-type inequality together with a summability condition on the lower-order terms. Adding some crucial hypotheses on the measure and on the vector fields (doubling property and Poincaré inequality), we are able to obtain some Liouville-type results. This chapter is based on the paper [GL03] by Gutiérrez and Lanconelli. In the second chapter we treat some ultraparabolic equations on Lie groups. In this case R^N is the support of a Lie group, and moreover we require that the vector fields satisfy left invariance. After recalling some results of Cinti [Cin07] about this class of operators and the associated potential theory, we prove a convexity property for mean-value operators of L-subharmonic functions, where L is our differential operator. In the third chapter we prove a necessary and sufficient condition for the regularity of boundary points for the Dirichlet problem, on an open subset of R^N, related to sub-Laplacians. On a Carnot group we give the essential background for this type of operator, and introduce the notion of "quasi-boundedness". Then we show the strict relationship between this notion, the fundamental solution of the given operator, and the regularity of boundary points.
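In symbols, the control distance described above admits the standard formulation via subunit paths (consistent with the verbal definition in the abstract):

    d(x,y) = \inf\Big\{ T>0 : \exists\,\gamma:[0,T]\to\mathbb{R}^N,\ \gamma(0)=x,\ \gamma(T)=y,\ \dot\gamma(t)=\sum_{j=1}^{m} a_j(t)\,X_j(\gamma(t)),\ \sum_{j=1}^{m} a_j(t)^2 \le 1 \Big\}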

Relevance: 40.00%

Abstract:

In recent years, a growing number of researchers have focused their attention on developing strategies to characterize the ADMET properties of drug candidates as quickly as possible. This trend stems from the awareness that about half of the drugs under development never reach the market because of deficiencies in their ADME characteristics, and that at least half of the molecules that do reach the market still have some toxicological or ADME problem [1]. Indeed, it matters little how active or specific a molecule may be: to become a drug it must be well absorbed, distributed throughout the organism, metabolized neither too quickly nor too slowly, and completely eliminated. Moreover, the molecule and its metabolites should not be toxic to the organism. It is therefore clear that a rapid determination of ADMET parameters in the early phases of drug development saves time and money, making it possible to select the most promising compounds right away and to discard those with unfavorable characteristics. This thesis fits into this context and shows the application of a simple technique, biochromatography, to rapidly characterize the binding of compound libraries to human serum albumin (HSA). It also shows the use of another, independent technique, circular dichroism, which allows the same drug-protein systems to be studied in solution, providing additional information about the stereochemistry of the binding process. HSA is the most abundant protein in blood. It functions as a carrier for a large number of molecules, both endogenous (e.g. bilirubin, thyroxine, steroid hormones, fatty acids) and xenobiotic, and it increases the solubility of lipophilic molecules that are poorly soluble in aqueous media, such as the taxanes. Binding to HSA is generally stereoselective and takes place at high-affinity binding sites. Moreover, it is well known that competition between drugs, or between a drug and endogenous metabolites, can significantly change their free fraction, modifying their activity and toxicity. Because of these properties, HSA can influence both the pharmacokinetic and the pharmacodynamic properties of drugs. It is not unusual for an entire drug development project to be abandoned because of too high an affinity for HSA, too short a half-life, or poor distribution due to weak HSA binding. From a pharmacokinetic point of view, therefore, HSA is the most important transport protein in plasma. A large number of publications demonstrate the reliability of the biochromatographic technique in studying biorecognition phenomena between proteins and small molecules [2-6]. My work focused mainly on the use of biochromatography as a method to evaluate the HSA-binding characteristics of several series of pharmaceutically relevant compounds, and on the improvement of this technique. To gain a better understanding of the binding mechanisms of the molecules studied, the same drug-HSA systems were also studied by circular dichroism (CD). Initially, HSA was immobilized on a packed epoxy silica column (50 x 4.6 mm i.d.), using a procedure previously reported in the literature [7] with some small modifications.
Briefly, immobilization was carried out by recirculating, through a previously packed column, a solution of HSA under defined conditions of pH and ionic strength. The column was then characterized with respect to the amount of correctly immobilized protein by frontal analysis of L-tryptophan [8]. Next, racemic solutions of molecules known to bind HSA enantioselectively were injected onto the column, to check that the immobilization procedure had not modified the binding properties of the protein. After characterization, the column was used to determine the binding percentage of a small series of HIV protease inhibitors (PIs) and to identify their binding site(s). The binding percentage was calculated from the capacity factor (k) of the samples. The value of this parameter in purely aqueous phase was extrapolated linearly from the plot of log k against the percentage (v/v) of 1-propanol in the mobile phase; only for two of the five compounds analyzed was it possible to measure the k value directly in the absence of organic solvent. All the PIs analyzed showed a high percentage of binding to HSA; in particular, the values for ritonavir, lopinavir, and saquinavir were greater than 95%. These results agree with literature data obtained with an optical biosensor [9], and they are consistent with the significant reduction of the inhibitory activity of these compounds observed in the presence of HSA, a reduction that appears to be larger for the compounds that bind the protein more strongly [10]. Competition studies were then performed by zonal chromatography. In this method, a solution of a competitor at known concentration is used as the mobile phase, while small amounts of analyte are injected onto the HSA-functionalized column. The competitors were selected on the basis of their selective binding to one of the main binding sites on the protein; specifically, sodium salicylate, ibuprofen, and sodium valproate were used as markers of site I, site II, and the bilirubin site, respectively. These studies showed independent binding of the PIs to sites I and II, whereas weak anticooperativity was observed for the bilirubin site. Finally, the same drug-protein system was investigated in solution by circular dichroism: the change in the induced CD signal of an equimolar [HSA]/[bilirubin] complex was monitored upon addition of aliquots of ritonavir, chosen as representative of the series. The results confirm the slight anticooperativity for the bilirubin site previously observed in the biochromatographic studies. The same protocol was subsequently applied to a monolithic epoxy silica column (50 x 4.6 mm) to evaluate the reliability of the monolithic support for biochromatographic applications. The monolithic support showed good chromatographic characteristics in terms of back pressure, efficiency, and stability, as well as reliability in the determination of HSA binding parameters.
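A minimal sketch of the extrapolation step just described (illustrative numbers; the conversion from retention factor k to fraction bound, b = k/(k+1), is a relation commonly used for albumin columns and is my assumption here, not stated in the abstract):

    import numpy as np

    # Illustrative retention data: log k at several 1-propanol % (v/v).
    propanol_pct = np.array([1.0, 2.0, 3.0, 4.0])
    log_k = np.array([1.55, 1.30, 1.05, 0.80])

    # Linear extrapolation of log k to a purely aqueous mobile phase (0%).
    slope, intercept = np.polyfit(propanol_pct, log_k, 1)
    k_aqueous = 10 ** intercept

    # Fraction bound from the retention factor: b = k / (k + 1).
    bound_pct = 100 * k_aqueous / (k_aqueous + 1)
    print(f"k(0%) = {k_aqueous:.1f}, bound = {bound_pct:.1f}%")  # ~98%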
This column was used to determine the HSA binding percentage of a series of polyaminoquinones developed within a research project on Alzheimer's disease. All the compounds showed a binding percentage above 95%, and a correlation was observed between the binding percentage and the characteristics of the side chain (length and number of amino groups). Competition studies of these compounds were then carried out by circular dichroism, revealing an anticooperative effect of the polyaminoquinones at sites I and II, whereas binding with respect to the bilirubin site proved to be independent. The experience gained with the monolithic support described above was then applied to a shorter epoxy silica column (10 x 4.6 mm). The method used to determine the binding percentage in the previous studies relies on data from several experiments, so considerable time is needed before the final value is obtained. A shorter column reduces analyte retention times, so the determination of the HSA binding percentage becomes much faster, moving from a medium-throughput analysis to high-throughput screening (HTS). In addition, shorter analysis times make it possible to avoid organic solvents in the mobile phase. After the 10 mm column had been characterized with the same method described for the other columns, a series of standards was injected while varying the mobile phase flow rate, to assess the possibility of using high flow rates. The column was then used to estimate the binding percentage of a series of molecules with different chemical characteristics. The possibility of using such a short column for competition studies was also evaluated, and the binding of a series of compounds to site I was investigated. Finally, the stability of the column after extensive use was assessed. The use of chromatographic supports functionalized with albumins of different origin (rat, dog, guinea pig, hamster, mouse, rabbit) can be proposed as a future application of these HTS columns. Information on the binding of drug candidates to the different albumins would allow a better comparison between data from in vitro experiments and data from animal experiments, facilitating the subsequent extrapolation to humans with the speed of an HTS method; it would also reduce the number of animals used in experiments. Several works in the literature demonstrate the reliability of columns functionalized with albumins of different origin [11-13]; the use of shorter columns could broaden their applications.

Relevance: 40.00%

Abstract:

In my PhD thesis I propose a Bayesian nonparametric estimation method for structural econometric models where the functional parameter of interest describes the economic agent's behavior. The structural parameter is characterized as the solution of a functional equation or, in more technical words, as the solution of an inverse problem that can be either ill-posed or well-posed. From a Bayesian point of view, the parameter of interest is a random function and the solution to the inference problem is the posterior distribution of this parameter. A regular version of the posterior distribution in functional spaces is characterized. However, the infinite dimension of the considered spaces causes a problem of non-continuity of the solution and hence a problem of inconsistency of the posterior distribution from a frequentist point of view (i.e. a problem of ill-posedness). The contribution of this essay is to propose new methods to deal with this problem of ill-posedness. The first one consists in adopting a Tikhonov regularization scheme in the construction of the posterior distribution, so that I end up with a new object that I call the regularized posterior distribution and that I propose as a solution to the inverse problem. The second approach consists in specifying a prior distribution of the g-prior type on the parameter of interest. I then identify a class of models for which this prior distribution is able to correct for the ill-posedness even in infinite-dimensional problems. I study the asymptotic properties of these proposed solutions and prove that, under some regularity conditions satisfied by the true value of the parameter of interest, they are consistent in a "frequentist" sense. Once the general theory is set, I apply my Bayesian nonparametric methodology to different estimation problems. First, I apply this estimator to deconvolution and to hazard rate, density, and regression estimation. Then, I consider the estimation of an Instrumental Regression, which is useful in micro-econometrics when we have to deal with problems of endogeneity. Finally, I develop an application in finance: I obtain the Bayesian estimator for the equilibrium asset pricing functional by using the Euler equation defined in Lucas' (1978) tree-type models.
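For reference, the Tikhonov scheme invoked above has the standard form (generic notation, not the thesis'): for a linear inverse problem Y = K\varphi + \varepsilon and a regularization parameter \alpha > 0,

    \hat{\varphi}_\alpha = (\alpha I + K^{*}K)^{-1} K^{*} Y

where K^{*} is the adjoint of K; a penalization of this type is what restores the continuity, and hence the frequentist consistency, of the solution in the regularized posterior construction.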

Relevance: 40.00%

Abstract:

In the present work, multi-objective optimization by genetic algorithms is investigated and applied to heat transfer problems. First, the work compares different reproduction processes employed by genetic algorithms, and two promising new processes are suggested. Second, two heat transfer problems are studied from a multi-objective point of view: wavy fins and the corrugated wall channel. Both cases had previously been studied with a single-objective optimizer; this work therefore extends the earlier studies into a more comprehensive one.
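A minimal sketch of the core selection concept in multi-objective genetic algorithms, Pareto non-domination (generic illustration, not the thesis' algorithm; both objectives are minimized):

    def dominates(a, b):
        """a dominates b: no worse in every objective, better in at least one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(points):
        """Non-dominated subset of a list of objective vectors."""
        return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

    # e.g. (pressure drop, -heat transfer rate) for candidate fin geometries
    candidates = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (1.5, 4.0), (2.5, 2.5)]
    print(pareto_front(candidates))  # (3.0, 4.0) is dominated and drops out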