929 results for Function Model
Abstract:
This paper examined the transmission of international agricultural commodity prices into the real exchange rate in Brazil over the period from January 2000 to February 2010. We used time series models (ARIMA model, transfer function model, intervention analysis, Johansen cointegration test) to determine the short- and long-run elasticities. The transfer function model results show that changes in international agricultural commodity prices are transmitted to the real exchange rate in Brazil in the short run; however, that transmission is less than unity, indicating an inelastic relationship. Johansen cointegration tests show that these variables are not cointegrated, i.e., they do not converge to a long-run equilibrium. These results agree with Cashin et al. (2004), who also found no long-run relationship between the real exchange rate and commodity prices in the case of Brazil. These results suggest that monetary shocks carry greater weight than real shocks in changes of the real exchange rate.
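The short-run pass-through elasticity described above can be illustrated with a minimal sketch: regress log-differences of the exchange rate on log-differences of the commodity price index. This is ordinary least squares on growth rates, not the paper's full transfer function specification, and all data below are simulated under an assumed true pass-through of 0.5.

```python
import math
import random

def short_run_elasticity(commodity, fx):
    """OLS slope of d(log fx) on d(log commodity): a short-run
    pass-through elasticity. Illustrative only; the paper uses a
    full transfer function model, not this simple regression."""
    dx = [math.log(commodity[t + 1]) - math.log(commodity[t])
          for t in range(len(commodity) - 1)]
    dy = [math.log(fx[t + 1]) - math.log(fx[t])
          for t in range(len(fx) - 1)]
    mx, my = sum(dx) / len(dx), sum(dy) / len(dy)
    cov = sum((a - mx) * (b - my) for a, b in zip(dx, dy))
    var = sum((a - mx) ** 2 for a in dx)
    return cov / var

# Simulated series where the true pass-through is 0.5 (inelastic)
random.seed(0)
p, e = [100.0], [2.0]
for _ in range(500):
    shock = random.gauss(0, 0.02)           # commodity price shock
    p.append(p[-1] * math.exp(shock))
    e.append(e[-1] * math.exp(0.5 * shock + random.gauss(0, 0.005)))

beta = short_run_elasticity(p, e)           # estimate near 0.5, i.e. < 1
```

An estimate below one, as here, is what the abstract calls an inelastic relationship.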
Abstract:
This paper tests the optimality of consumption decisions at the aggregate level, taking into account popular deviations from the canonical constant-relative-risk-aversion (CRRA) utility function model: rule-of-thumb behavior and habit. First, based on the critiques in Carroll (2001) and Weber (2002) of the linearization and testing strategies using Euler equations for consumption, we provide extensive empirical evidence of their inappropriateness, a drawback for standard rule-of-thumb tests. Second, we propose a novel approach to test for consumption optimality in this context: nonlinear estimation coupled with return aggregation, in which rule-of-thumb behavior and habit are special cases of an all-encompassing model. We estimated 48 Euler equations using GMM. At the 5% level, we rejected optimality only twice out of 48 times. Moreover, out of 24 regressions, we found the rule-of-thumb parameter to be statistically significant only twice. Hence, lack of optimality in consumption decisions represents the exception, not the rule. Finally, we found the habit parameter to be statistically significant on four occasions out of 24.
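The GMM approach rests on the CRRA Euler-equation moment condition E[β (c_{t+1}/c_t)^{-γ} R_{t+1} − 1] = 0. The sketch below evaluates the sample analogue of that moment on simulated data; it omits the rule-of-thumb and habit terms the paper nests, and the parameter values are assumptions for illustration, not estimates from the paper.

```python
import math
import random

def euler_moment(beta, gamma, growth, returns):
    """Sample analogue of the CRRA Euler-equation moment
    E[beta * g^(-gamma) * R - 1], where g is gross consumption
    growth and R the gross asset return. GMM estimation would
    drive this (and instrumented versions of it) toward zero."""
    m = [beta * g ** (-gamma) * r - 1.0 for g, r in zip(growth, returns)]
    return sum(m) / len(m)

random.seed(1)
beta0, gamma0 = 0.97, 2.0                    # assumed "true" preferences
growth = [math.exp(random.gauss(0.02, 0.02)) for _ in range(2000)]
# Returns constructed to satisfy the Euler equation exactly at (beta0, gamma0)
returns = [g ** gamma0 / beta0 for g in growth]

m_true = euler_moment(beta0, gamma0, growth, returns)    # essentially zero
m_wrong = euler_moment(0.90, 0.5, growth, returns)       # clearly nonzero
```

At the true parameters the moment vanishes; at misspecified parameters it does not, which is the violation a GMM J-type test picks up.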
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Betanin is a natural pigment with antioxidant properties used as a food colourant. This work describes the spectrophotometric and chromatographic quantification of betanin (2S/15S) and its epimer isobetanin (2S/15R) in fresh beetroot juice, food-grade beetroot powder and betanin standard diluted in dextrin. Absorption spectra of all three samples were deconvoluted using a mixed three-function model. Food-grade beetroot powder has the largest amount of violet-red impurities, probably formed during processing. The purification of betanin from these complex matrices was carried out by seven different methods. Ion exchange chromatography was the most efficient method for the purification of betanin from all samples; however, fractions contain high amounts of salt. Reversed-phase HPLC as well as reversed-phase column chromatography also produced good results at a much faster rate. The longer retention time of isobetanin when compared to betanin in reversed-phase conditions has been investigated by means of quantum-mechanical methods. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
CHAPTER 1: INTRODUCTION. This work concerns the metric use of historical satellite images with panoramic geometry; in particular, it processes satellite images acquired by the US CORONA platform, designed and operated essentially for military purposes between the 1960s and 1970s and recently declassified, which has opened access to non-military purposes and users. The recovery of aerial and satellite images of the past is of great interest for a wide range of territorial applications, from the analysis of urban or regional development to specific local investigations of archaeological, industrial, or environmental sites. There is in fact a vast informational heritage that could fill the gaps in cartographic documentation, which, for obvious technical and economic reasons, represents territorial evolution only asynchronously and sporadically, and with "distortions" and limitations in informational content tied to the purposes and representational conventions of maps over time and across different applications. A photographic image offers a complete and objective representation of what exists, and can very effectively complement cartographic data or substitute for it where none exists. Most of the heritage of historical imagery is certainly linked to photogrammetric flights which, from the first decades of the twentieth century, covered vast areas of the most advanced countries, or regions of wartime interest. Alongside these, and obviously over more recent periods, stand the images acquired from satellite platforms, among which those taken for military espionage are of great interest, being of high geometric resolution and excellent detail.
Unfortunately, this rich heritage remains largely inaccessible today, although initiatives have recently been launched to allow civilian access, partly in view of the obsolescence of the data and the availability of other, better sources of information offered by modern remote sensing. The use of historical images, whether aerial or satellite, is in most cases qualitative, aimed at investigating the presence or absence of objects or phenomena, and rarely takes on a metric, objective character, which would among other things require technical data (for example, the calibration certificate in the case of aerial photogrammetric cameras) that have been lost or are inaccessible. It should also be remembered that the cameras of the period were often subject to optical distortion or other forms of image degradation that made metric use difficult. On the other hand, a metric use of these images would give the analysis of the territory, and of the changes it has undergone, an objective meaning that is essential for several purposes: for example, measuring objects that no longer exist, or precisely comparing and co-registering historical images with suitably georeferenced current ones. The CORONA images are particularly interesting for a number of specific features: first, they combine high resolution (ground pixel size down to 1.80 m) with wide ground coverage (frames from some missions cover strips up to 250 km long). These two characteristics derive from the acquisition principle adopted, namely the panoramic geometry, chosen precisely because it is the only one that combines them, and therefore well suited to espionage purposes.
Moreover, given the number and frequency of the missions within the CORONA program, the time series of these frames allow a rich and detailed reconstruction of past territorial configurations, thanks to the greater quantity of information and the impartiality associated with photographic products. It should be made clear from the outset that these images, although they represent a remarkable historical resource (they date from 1959 to 1972 and cover very large regions of great interest for territorial analyses), have very rarely been used for metric purposes. This is probably because their metric processing is far from simple, for a series of reasons that will be highlighted in the following chapters. The experimental work carried out in this thesis had two primary objectives, one general and one more specific: on the one hand, to assess broadly the potential of the enormous heritage these images represent (obtainable at low cost compared to similar products); on the other, to investigate the local territorial situation of an area of south-eastern Turkey (around the archaeological site of Tilmen Höyük) where a project led by the University of Bologna is active (scientific director Prof. Nicolò Marchetti, Department of Archaeology), with which DISTART has actively collaborated since 2005. The activity is conducted in collaboration with Istanbul University and the Gaziantep Archaeological Museum.
This work is also part of a broader perspective than the one just described, namely a regional-scale study of the area containing the archaeological excavations of Tilmen Höyük; the availability of multi-temporal images over a wide time span, as well as multi-sensor and multispectral data, would equip such a study with tools of the highest interest for characterising the changes that have occurred. As for the more general aspect, developing a procedure for the metric processing of CORONA images may prove useful to the entire community involved in GIS and remote sensing; as noted above, these images (which cover an area of almost two million square kilometres) represent an immense historical photographic heritage that could (and should) be used both for archaeological purposes and to support the study, in a GIS environment, of territorial development dynamics in areas where satellite imagery and pre-existing cartographic data are scarce or entirely absent. The work is divided into six chapters, of which this is the first. The second chapter gives a summary description of the CORONA space programme (a US project for photo-reconnaissance of the territory of the former Soviet Union and of the Middle Eastern areas politically related to it); it reports on the origin and evolution of the programme, describes in some detail the optics employed and the image-acquisition modes, and provides all the references (historical and otherwise) useful to anyone wishing to learn more about this extraordinary space programme.
The third chapter briefly discusses panoramic images in general: their acquisition modes, the geometric and perspective principles underlying panoramic imaging, and the advantages and drawbacks of this type of image. It also presents the various methods found in the literature for correcting panoramic images, and those employed by the (in truth rather few) authors who have chosen to give CORONA images a metric meaning (quantitative, and not merely qualitative, as was long the case). The fourth chapter briefly describes the archaeological site of Tilmen Höyük: its geographical location, the chronology of the study campaigns that have concerned it, and the monuments and artefacts found in the area, which have made possible a virtual reconstruction of the original appearance of the city and a deeper understanding of the situation of the Mediterranean capitals during the Middle Bronze Age. The fifth chapter is devoted to the main goal of the work, namely the generation of the orthophotomosaic of the area mentioned above. After a theoretical introduction to the production of this type of product (applicable procedures and transformations, pixel interpolation methods, quality of the DEM used), the results obtained are presented and discussed, with attention to the correlations among them and to the problems of various kinds encountered during this thesis work. The sixth and final chapter contains the conclusions. Appendix A reports the tables of the control points used for the exterior orientation of the frames.
Abstract:
The combustion strategy in a diesel engine has an impact on emissions, fuel consumption, and exhaust temperatures. The PM mass retained in the CPF is a function of the NO2 and PM concentrations, in addition to the exhaust temperatures and flow rates. Thus, the engine combustion strategy affects exhaust characteristics, which in turn affect CPF operation and the PM mass retained and oxidized. In this report, a process has been developed to simulate the relationship between engine calibration, performance, and HC and PM oxidation in the DOC and CPF, respectively. Fuel rail pressure (FRP) and start of injection (SOI) sweeps were carried out at five steady-state engine operating conditions. These data, along with data from a previously conducted surrogate HD-FTP cycle [1], were used to create a transfer function model that estimates the engine-out emissions, flow rates, and temperatures for varied FRP and SOI over a transient cycle. Four different calibrations (test cases) were considered in this study and simulated through the transfer function model and the DOC model [1, 2]. The DOC outputs were then input into a model that simulates NO2-assisted and thermal PM oxidation inside a CPF. Finally, the results were analyzed to determine how engine calibration impacts engine fuel consumption, HC oxidation in the DOC, and PM oxidation in the CPF. Active regeneration was also simulated for the various test cases, and a comparative analysis of the fuel penalties involved was carried out.
Abstract:
Amphibian metamorphosis is marked by dramatic, thyroid hormone (TH)-induced changes involving gene regulation by the TH receptor (TR). It has been postulated that TR-mediated gene regulation involves chromatin remodeling. In the absence of ligand, TR can repress gene expression by recruiting a histone deacetylase complex, whereas liganded TR recruits a histone acetylase complex for gene activation. Earlier studies have led us to propose a dual function model for TR during development. In premetamorphic tadpoles, unliganded TR represses transcription involving histone deacetylation. During metamorphosis, endogenous TH allows TR to activate gene expression through histone acetylation. Here, using a chromatin immunoprecipitation assay, we directly demonstrate constitutive TR binding to TH response genes in vivo in premetamorphic tadpoles. We further show that TH treatment leads to histone deacetylase release from TH response gene promoters. Interestingly, in whole animals, changes in histone acetylation show little correlation with the expression of TH response genes. On the other hand, in the intestine and tail, where TH response genes are known to be up-regulated more dramatically by TH than in most other organs, we demonstrate that TH treatment induces gene activation and histone H4 acetylation. These data argue for a role of histone acetylation in transcriptional regulation by TRs during amphibian development in some tissues, whereas in others changes in histone acetylation levels may play no or only a minor role, supporting the existence of important alternative mechanisms in gene regulation by TR.
Abstract:
At the level of the cochlear nucleus (CN), the auditory pathway divides into several parallel circuits, each of which provides a different representation of the acoustic signal. Here, the representation of the power spectrum of an acoustic signal is analyzed for two CN principal cells—chopper neurons of the ventral CN and type IV neurons of the dorsal CN. The analysis is based on a weighting function model that relates the discharge rate of a neuron to first- and second-order transformations of the power spectrum. In chopper neurons, the transformation of spectral level into rate is a linear (i.e., first-order) or nearly linear function. This transformation is a predominantly excitatory process involving multiple frequency components, centered in a narrow frequency range about best frequency, that usually are processed independently of each other. In contrast, type IV neurons encode spectral information linearly only near threshold. At higher stimulus levels, these neurons are strongly inhibited by spectral notches, a behavior that cannot be explained by level transformations of first- or second-order. Type IV weighting functions reveal complex excitatory and inhibitory interactions that involve frequency components spanning a wider range than that seen in choppers. These findings suggest that chopper and type IV neurons form parallel pathways of spectral information transmission that are governed by two different mechanisms. Although choppers use a predominantly linear mechanism to transmit tonotopic representations of spectra, type IV neurons use highly nonlinear processes to signal the presence of wide-band spectral features.
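The first-order weighting-function model described above predicts discharge rate as a baseline plus a weighted sum of spectral levels across frequency channels. The sketch below evaluates such a model with a narrow excitatory weighting centered on best frequency; the weights, baseline, and spectra are illustrative assumptions, not fitted values from the study.

```python
def predicted_rate(weights, spectrum, r0=50.0):
    """First-order (linear) weighting-function model: discharge rate
    as a baseline r0 (spikes/s) plus a weighted sum of spectral
    levels (dB) in each frequency channel. A second-order model would
    add terms in products of channel levels (omitted here)."""
    return r0 + sum(w * s for w, s in zip(weights, spectrum))

# Narrow excitatory weighting centered on best frequency (channel 2),
# resembling the chopper-like linear transformation in the abstract
weights = [0.0, 0.5, 2.0, 0.5, 0.0]
flat = [10.0] * 5                            # flat 10 dB spectrum
notched = [10.0, 10.0, 0.0, 10.0, 10.0]      # spectral notch at BF

rate_flat = predicted_rate(weights, flat)     # 50 + 30 = 80 spikes/s
rate_notch = predicted_rate(weights, notched)  # 50 + 10 = 60 spikes/s
```

A purely linear model like this responds to a notch with a graded rate decrease; the strong notch-driven inhibition of type IV neurons is exactly what such first- or second-order transformations fail to capture.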
Abstract:
Nucleation is the first stage in any granulation process where binder liquid first comes into contact with the powder. This paper investigates the nucleation process where binder liquid is added to a fine powder with a spray nozzle. The dimensionless spray flux approach of Hapgood et al. (Powder Technol. 141 (2004) 20) is extended to account for nonuniform spray patterns and allow for overlap of nuclei granules rather than spray drops. A dimensionless nuclei distribution function which describes the effects of the design and operating parameters of the nucleation process (binder spray characteristics, the nucleation area ratio between droplets and nuclei and the powder bed velocity) on the fractional surface area coverage of nuclei on a moving powder bed is developed. From this starting point, a Monte Carlo nucleation model that simulates full nuclei size distributions as a function of the design and operating parameters that were implemented in the dimensionless nuclei distribution function is developed. The nucleation model was then used to investigate the effects of the design and operating parameters on the formed nuclei size distributions and to correlate these effects to changes of the dimensionless nuclei distribution function. Model simulations also showed that it is possible to predict nuclei size distributions beyond the drop controlled nucleation regime in Hapgood's nucleation regime map. Qualitative comparison of model simulations and experimental nucleation data showed similar shapes of the nuclei size distributions. In its current form, the nucleation model can replace the nucleation term in one-dimensional population balance models describing wet granulation processes. Implementation of more sophisticated nucleation kinetics can make the model applicable to multi-dimensional population balance models.
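The Monte Carlo idea in the abstract can be sketched in one dimension: drops land uniformly at random on a moving bed, each forming a nucleus footprint, and footprints that overlap merge into larger nuclei, yielding a nuclei size distribution and a fractional surface coverage. This is a simplified stand-in for the paper's model, with all parameter values assumed for illustration.

```python
import random

def simulate_nuclei(n_drops, bed_length, nucleus_diam, seed=0):
    """1-D Monte Carlo sketch of spray nucleation: nuclei centers fall
    uniformly at random; overlapping footprints merge into larger
    nuclei (overlap of nuclei, not spray drops, as in the extended
    spray-flux approach). Returns merged nucleus sizes and the
    fractional surface coverage of the bed."""
    rng = random.Random(seed)
    centers = sorted(rng.uniform(0, bed_length) for _ in range(n_drops))
    r = nucleus_diam / 2.0
    merged = []
    for c in centers:
        lo, hi = c - r, c + r
        if merged and lo <= merged[-1][1]:
            merged[-1][1] = max(merged[-1][1], hi)   # overlap: grow nucleus
        else:
            merged.append([lo, hi])                  # isolated nucleus
    sizes = [hi - lo for lo, hi in merged]
    coverage = sum(sizes) / bed_length
    return sizes, coverage

sizes, coverage = simulate_nuclei(n_drops=200, bed_length=100.0,
                                  nucleus_diam=0.5)
```

With this drop density the naive (no-overlap) coverage would be 1.0; merging pushes the realized coverage toward the random-coverage limit 1 − exp(−Ψ), illustrating why overlap must be accounted for outside the drop-controlled regime.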
Abstract:
Wool tenderness is a significant problem in Australia, especially in areas where sheep graze under highly seasonal conditions. In this study, a profit function model is specified, estimated and simulated to assess the economic impact of staple strength-enhancing research on the profits of Australian woolgrowers. The model is based on a number of fundamental characteristics of the Australian wool industry and the staple-strength enhancing technology being assessed. The model consists of a system of demand and supply equations that are specified in terms of effective, rather than actual, prices. The interrelationships between the inputs and outputs are allowed for in the model in a manner that is consistent with theoretical restrictions. The adoption of the new feed management strategy results in a 4.4% increase in the expected profits of Australian wool producers in the short-run, and a 2.2% increase in expected profits in the long-run.
Abstract:
This paper investigates vertical economies between generation and distribution of electric power, and horizontal economies between different types of power generation in the U.S. electric utility industry. Our quadratic cost function model includes three generation output measures (hydro, nuclear and fossil fuels), which allows us to analyze the effect that generation mix has on vertical economies. Our results provide (sample mean) estimates of vertical economies of 8.1% and horizontal economies of 5.4%. An extensive sensitivity analysis is used to show how the scope measures vary across alternative model specifications and firm types. © 2012 Blackwell Publishing Ltd and the Editorial Board of The Journal of Industrial Economics.
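Vertical economies of this kind are commonly measured as the proportional cost saving from joint versus separate production, evaluated on an estimated cost function. The sketch below applies that scope measure to a two-output quadratic cost function; the coefficients are illustrative assumptions, not the paper's estimates, and a negative cross-quadratic term is what generates cost complementarity here.

```python
def cost(outputs, a, b, q):
    """Quadratic multiproduct cost function:
    C(y) = a + sum_i b[i]*y[i] + 0.5 * sum_ij q[i][j]*y[i]*y[j]."""
    n = len(outputs)
    c = a + sum(b[i] * outputs[i] for i in range(n))
    c += 0.5 * sum(q[i][j] * outputs[i] * outputs[j]
                   for i in range(n) for j in range(n))
    return c

def vertical_economies(gen, dist, a, b, q):
    """Scope measure: (C(gen,0) + C(0,dist) - C(gen,dist)) / C(gen,dist).
    Positive values mean joint generation and distribution is cheaper
    than stand-alone production."""
    joint = cost([gen, dist], a, b, q)
    separate = cost([gen, 0.0], a, b, q) + cost([0.0, dist], a, b, q)
    return (separate - joint) / joint

# Illustrative coefficients; q[0][1] < 0 encodes cost complementarity
a, b = 10.0, [1.0, 1.2]
q = [[0.02, -0.03], [-0.03, 0.02]]
ve = vertical_economies(50.0, 40.0, a, b, q)   # > 0: joint production saves cost
```

The paper's richer specification splits generation into hydro, nuclear, and fossil outputs, so its scope measures also vary with the generation mix.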
Abstract:
Since 1995, Florida has been one of the leading states in the country in initiating a high-stakes school accountability system. Public schools in Florida receive letter grades based on their performance on the Florida Comprehensive Assessment Test (FCAT). These school grades have significant effects on schools' reputations and funding. Consequently, the plan has been criticized for grading all schools in the same manner, without taking into account variables such as student poverty and mobility rates, which are beyond the control of the school.
The purpose of this study was to examine the relationship of student variables (poverty and mobility rates) and teacher variables (average years of teacher experience and attained degree level) to FCAT math and reading performance. This research utilized an education production function model to examine which set of inputs (student or teacher) has a stronger influence on student academic output as measured by the FCAT.
The data collected for this study came from over 1,500 public elementary schools in Florida that listed all pertinent information for two school years (1998/1999 and 1999/2000) on the Florida Department of Education's website.
It was concluded that student poverty, teacher average years of experience, and student mobility, taken together, provide a strong predictive measure of FCAT reading and math performance. However, the set of student inputs was significantly stronger than the teacher inputs. High student poverty was highly correlated with low FCAT scores. Teacher experience and student mobility rates were not nearly as strongly related to FCAT scores as student poverty. The results of this study provide evidence for educators and other school stakeholders of the relative degree to which student and teacher variables are related to student academic achievement. The underlying reasons for these relationships will require further examination in future studies.
These results raise questions for Florida's school policymakers about the educational equity of the state's accountability system and its implementation.
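An education production function of this kind regresses a school's achievement output on its inputs. The sketch below uses a single input (poverty rate) and simulated school data with an assumed negative effect; the study itself estimated a multi-input model, and none of the numbers here come from it.

```python
import random

def ols_slope(x, y):
    """Single-input education production function sketch: OLS slope of
    school mean test score on one input. The study used several
    inputs (poverty, mobility, teacher experience) jointly."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

# Simulated schools: higher poverty rate -> lower mean score, plus noise
random.seed(2)
poverty = [random.uniform(0, 100) for _ in range(1500)]
score = [320 - 0.8 * p + random.gauss(0, 15) for p in poverty]

slope = ols_slope(poverty, score)   # recovers a clearly negative slope
```

The strongly negative slope mirrors the study's headline finding that poverty is the dominant (negative) predictor of FCAT performance.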
Abstract:
Historic changes in water-use management in the Florida Everglades have caused the quantity of freshwater inflow to Florida Bay to decline by approximately 60% while altering its timing and spatial distribution. Two consequences have been (1) increased salinity throughout the bay, including occurrences of hypersalinity, coupled with a decrease in salinity variability, and (2) change in benthic habitat structure. Restoration goals have been proposed to return the salinity climates (salinity and its variability) of Florida Bay to more estuarine conditions through changes in upstream water management, thereby returning seagrass species cover to a more historic state. To assess the potential for meeting those goals, we used two modeling approaches and long-term monitoring data. First, we applied the hydrological mass balance model FATHOM to predict salinity climate changes in sub-basins throughout the bay in response to a broad range of freshwater inflow from the Everglades. Second, because seagrass species exhibit different sensitivities to salinity climates, we used the FATHOM-modeled salinity climates as input to a statistical discriminant function model that associates eight seagrass community types with water quality variables including salinity, salinity variability, total organic carbon, total phosphorus, nitrate, and ammonium, as well as sediment depth and light reaching the benthos. Salinity climates in the western sub-basins bordering the Gulf of Mexico were insensitive to even the largest (5-fold) modeled increases in freshwater inflow. However, the north, northeastern, and eastern sub-basins were highly sensitive to freshwater inflow and responded to comparatively small increases with decreased salinity and increased salinity variability. The discriminant function model predicted increased occurrences of Halodule wrightii communities and decreased occurrences of Thalassia testudinum communities in response to the more estuarine salinity climates.
The shift in community composition represents a return to the historically observed state and suggests that restoration goals for Florida Bay can be achieved through restoration of freshwater inflow from the Everglades.
Abstract:
Annual mean salinity, light availability, and sediment depth to bedrock structured the submerged aquatic vegetation (SAV) communities in subtropical mangrove-lined estuaries. Three distinct SAV communities (i.e., Chara group, Halodule group, and Low SAV coverage group) were identified along the Everglades–Florida Bay ecotone and related to water quality using a discriminant function model that predicted the type of plant community at a given site from salinity, light availability, and sediment depth to bedrock. Mean salinity alone was able to correctly classify 78% of the sites and reliably separated the Chara group from the Halodule group. The addition of light availability and sediment depth to bedrock increased model accuracy to 90% and further distinguished the Chara group from the Halodule group. Light availability was uniquely valuable in separating the Chara group from the Low SAV coverage group. Regression analyses identified significant relationships between phosphorus concentration, phytoplankton abundance, and light availability, and suggest that a decline in water transparency, associated with increasing salinity, may have also contributed to the historical decline of Chara communities in the region. This investigation applies relationships between environmental variables and SAV distribution and provides a case study in the application of these general principles to ecosystem management.
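A discriminant function model of this kind assigns each site to the community whose profile it most resembles in predictor space. The sketch below uses a nearest-centroid rule over (salinity, light availability, sediment depth), which is a simplified stand-in for discriminant analysis; the centroid values and the test site are illustrative assumptions, not data from the study.

```python
def nearest_centroid(sample, centroids):
    """Classify a site by the nearest class centroid in
    (salinity psu, light fraction, sediment depth cm) space.
    A simplified stand-in for the paper's discriminant function
    model; real discriminant analysis also weights and combines
    the predictors via fitted discriminant axes."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda name: dist2(sample, centroids[name]))

# Illustrative community centroids (hypothetical values)
centroids = {
    "Chara":    (5.0, 0.8, 40.0),    # low salinity, clear water, deep sediment
    "Halodule": (25.0, 0.6, 30.0),   # higher salinity
    "Low SAV":  (15.0, 0.2, 10.0),   # low light, shallow sediment
}

# A fresh, clear-water site classifies into the Chara group
group = nearest_centroid((7.0, 0.75, 38.0), centroids)
```

As in the study, salinity does most of the separating here, with light and sediment depth refining the boundary between groups.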