34 results for California bearing ratio
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
We analyzed offspring sex ratio variation in the Mediterranean Cory's Shearwater (Calonectris d. diomedea) during two consecutive breeding seasons in two colonies. We tested for differential breeding conditions between years and colonies by examining several breeding parameters and parental condition, and then explored the relationship between offspring sex ratio, parental condition, and breeding parameters. This species is sexually dimorphic, with males larger and heavier than females; consequently, we expected a differential parental cost in rearing the sexes, or a greater sensitivity of male chicks to adverse conditions, either of which could lead to biased sex ratios. Chicks were sexed molecularly by amplification of the CHD genes. Offspring sex ratio did not differ from parity, either at hatching or at fledging, regardless of colony or year. However, parental body condition and breeding parameters such as egg size and breeding success did differ between years and colonies. Nevertheless, neither nestling mortality nor body condition at fledging varied between years or colonies, suggesting that male and female chicks were probably not differentially affected by variability in breeding conditions.
Abstract:
Research carried out during a two-month stay in the laboratory of Dr. Donna M. Ferriero at the Department of Neurology of the University of California, San Francisco. Using a model of ischemic brain injury in postnatal rats, the effects of integrin interactions on the development of the lesion were studied as a therapeutic strategy.
Abstract:
Research carried out during a stay at the United States Department of Agriculture (USDA), California, USA, between 1 July and 30 September 2005. Edible coatings are being considered for fresh-cut fruit as a strategy to reduce the damaging effects that minimal processing inflicts on plant tissues. Fruit purées are an excellent option for forming edible coatings or films, since they provide a matrix of primary polysaccharides (pectic and cellulosic substances) with film-forming and plasticizing properties thanks to the sugars present in the fruit. These films are good oxygen barriers in low-moisture food systems, improving product quality and extending shelf life. Apple-purée-based films were evaluated as carriers of antioxidant agents and of essential oils as antimicrobials; their water-vapour and oxygen permeability were studied, along with their tensile, colour, and antimicrobial properties.
Abstract:
Report by a Working Group on Bibliographic Services of the University of California, presenting a series of recommendations on the changes these services should implement to improve their performance. The authors highlight how far many current library services lag behind the features offered by portals such as Amazon or Google. The recommendations are organized into four areas: improving search and retrieval, redesigning the OPAC, adopting new cataloguing practices, and supporting continuous improvement. The report concludes by listing some sixty additional possible actions that were also considered and the reasons why they were ultimately discarded.
Abstract:
Research project carried out during a stay at the Graduate School of Education and Information Studies (GSEIS) of the University of California at Los Angeles (UCLA), United States, between January and June 2007. In the context of preparing a future doctoral thesis on critical communicative methodology and human-computer interaction, the aim was to deepen this work from an international perspective. GSEIS, and UCLA in general, offers countless bibliographic resources as well as internationally renowned faculty researching at the intersection of education, inclusion and social transformation, communication, ICT, and human-computer interaction design, the very combination of disciplines in which my doctoral thesis is situated. Access to the Young Research Library, attendance at several conferences related to my field of study, tutorials with faculty from GSEIS and other UCLA departments, and the invitation to take part in Professor Douglas Kellner's doctoral seminar all contributed remarkably to my project, providing material from the international literature and numerous examples of good practice in projects linked to Participatory Design as a methodology in its own right, distinct from User-Centred Design, one of the central aspects of my thesis. With all this I was able to reinforce and develop four chapters of my dissertation, specifically those on the social and methodological context, those presenting human-computer interaction design from a general standpoint, and the one focusing on Participatory Design and its links with critical communicative methodology.
Abstract:
Based on the critique of Ahumada et al. (2007, Review of Income and Wealth), we revise existing estimates of the size of the German underground economy. Among other things, it turns out that most of these estimates are untenable and that the tax-pressure-induced size of the German underground economy may be much lower than previously thought. To this extent, German policy and law makers have been misguided during the last three decades. We therefore introduce the Modified Cash-Deposit Ratio (MCDR) approach, which is not subject to the recent critique, and apply it to Germany for the period 1960 to 2008. JEL: O17, Q41, C22. Keywords: underground economy, shadow economy, cash-deposit ratio, currency demand approach, MIMIC approach
Abstract:
Research project carried out during a stay at the University of California, Irvine, USA, between July 2007 and January 2008. Thermocouples are currently the most popular and most widely used temperature sensors, found across a broad range of industrial, domestic, and other applications. Miniaturizing these devices down to extremely small dimensions opens up a wide range of new applications, for example in lab-on-a-chip technology. In this project, the thermocouple concept, that is, two wires of different metals joined at one end, was extrapolated to the nanometre scale, using nanowires as the building block. These nanowires were synthesized by a new procedure developed in the research group at the University of California, Irvine, which made it possible to work with nanowires of different dimensions (independent control of height and width) and to achieve a higher success rate in fabricating these thermometers. The method also allows these nanostructures to be deposited on non-conducting substrates in a controllable manner, notably simplifying the whole fabrication process. These devices demonstrated that, besides being good temperature sensors at the macroscopic level (ambient heat sources), they also allow temperature determination at the microscopic level (focused heat sources, such as laser beams). Their characterization required state-of-the-art equipment (lasers, amplifiers, atomic force microscopes) and even the design of new instruments. These nanothermocouples show extraordinary properties, such as high sensitivity, fast response to thermal stimuli, and stable behaviour with use and over time.
Abstract:
We explore in depth the validity of a recently proposed scaling law for earthquake inter-event time distributions in Southern California, using the waveform cross-correlation catalog of Shearer et al. Two statistical tests are used. On the one hand, the standard two-sample Kolmogorov-Smirnov test is in agreement with the scaling of the distributions. On the other hand, the one-sample Kolmogorov-Smirnov statistic, complemented with Monte Carlo simulation of the inter-event times as done by Clauset et al., supports the validity of the gamma distribution as a simple model of the scaling function appearing in the scaling law, for rescaled inter-event times above 0.01, except for the largest data set (magnitude greater than 2). A discussion of these results is provided.
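The one-sample test described above can be sketched as follows. This is a minimal illustration of the Clauset-style procedure (fit a gamma model, then calibrate the Kolmogorov-Smirnov statistic by refitting on simulated samples); the data here are synthetic, not the Shearer et al. catalog, and the parameter values are invented.

```python
# Sketch: one-sample KS test for a gamma model with a Monte Carlo
# p-value, refitting on each simulated sample as in Clauset et al.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic rescaled inter-event times (shape/scale chosen arbitrarily).
data = rng.gamma(shape=0.7, scale=1.4, size=500)
data = data[data > 0.01]              # keep rescaled times above 0.01

# Fit a gamma model and compute the observed KS statistic.
shape, loc, scale = stats.gamma.fit(data, floc=0)
d_obs = stats.kstest(data, "gamma", args=(shape, loc, scale)).statistic

# Monte Carlo: refit on each simulated sample so the p-value accounts
# for the parameters having been estimated from the data.
n_sim, count = 100, 0
for _ in range(n_sim):
    sim = stats.gamma.rvs(shape, loc=loc, scale=scale,
                          size=len(data), random_state=rng)
    s, l, sc = stats.gamma.fit(sim, floc=0)
    count += stats.kstest(sim, "gamma", args=(s, l, sc)).statistic >= d_obs
p_value = count / n_sim               # large p supports the gamma model
```

A large Monte Carlo p-value means the observed deviation from the fitted gamma is typical of genuine gamma samples of the same size, which is the sense in which the test "supports" the model.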
Abstract:
We compare correspondence analysis to the log-ratio approach based on compositional data. We also compare correspondence analysis with an alternative approach using the Hellinger distance for representing categorical data in a contingency table. We propose a coefficient which globally measures the similarity between these approaches. This coefficient can be decomposed into several components, one for each principal dimension, indicating the contribution of the dimensions to the difference between the two representations. These three methods of representation can produce quite similar results. One illustrative example is given.
Abstract:
We compare two methods for visualising contingency tables and develop a method called the ratio map which combines the good properties of both. The first is a biplot based on the log-ratio approach to compositional data analysis. This approach is founded on the principle of subcompositional coherence, which assures that results are invariant to considering subsets of the composition. The second approach, correspondence analysis, is based on the chi-square approach to contingency table analysis. A cornerstone of correspondence analysis is the principle of distributional equivalence, which assures invariance in the results when rows or columns with identical conditional proportions are merged. Both methods may be described as singular value decompositions of appropriately transformed matrices. Correspondence analysis includes a weighting of the rows and columns proportional to the margins of the table. If this idea of row and column weights is introduced into the log-ratio biplot, we obtain a method which obeys both principles of subcompositional coherence and distributional equivalence.
Abstract:
The 1994 Northridge earthquake sent ripples through insurance companies everywhere. It was one in a series of natural disasters, such as Hurricane Andrew, which, together with the problems at Lloyd's of London, have insurance companies running for cover. This paper presents a calibration of the U.S. economy in a model with financial markets for insurance derivatives, which suggests the U.S. economy can deal with the damage of a natural catastrophe far better than one might think.
Abstract:
We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to spectral mapping, a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
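The core computation behind weighted log-ratio analysis can be sketched in a few lines: take logs of the table of proportions, double-centre with the row and column masses as weights, and decompose by SVD. The toy table below is invented for illustration; this is a sketch of the general technique, not a reproduction of the paper's analyses.

```python
# Sketch: weighted log-ratio analysis (spectral mapping) of a small
# positive two-way table via a weighted, double-centred log transform.
import numpy as np

N = np.array([[10., 20., 30.],
              [15., 25., 10.],
              [ 5., 30., 40.]])

P = N / N.sum()
r = P.sum(axis=1)                 # row masses (weights)
c = P.sum(axis=0)                 # column masses
L = np.log(P)

# Weighted double-centring: subtract the mass-weighted row and column
# means and add back the weighted grand mean.
row_mean = L @ c
col_mean = r @ L
grand = r @ L @ c
Z = L - row_mean[:, None] - col_mean[None, :] + grand

# Weight by square roots of the masses, then decompose.
S = np.sqrt(r)[:, None] * Z * np.sqrt(c)[None, :]
U, sing, Vt = np.linalg.svd(S, full_matrices=False)

# Principal row coordinates in the leading dimensions.
rows = (U / np.sqrt(r)[:, None]) * sing
```

The double-centring makes the result depend only on ratios of the table entries (subcompositional coherence), while the margin weights are what restore distributional equivalence, as the abstract describes.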
Abstract:
Many dynamic revenue management models divide the sale period into a finite number of periods T and assume, invoking a fine-enough grid of time, that each period sees at most one booking request. These Poisson-type assumptions restrict the variability of the demand in the model, but researchers and practitioners have been willing to overlook this for the benefit of tractability. In this paper, we criticize this model from another angle. Estimating the discrete finite-period model poses problems of indeterminacy and non-robustness: arbitrarily fixing T leads to arbitrary control values, while estimating T from data adds an additional layer of indeterminacy. To counter this, we first propose an alternative finite-population model that avoids the problem of fixing T and allows a wider range of demand distributions, while retaining the useful marginal-value properties of the finite-period model. The finite-population model still requires jointly estimating the market size and the parameters of the customer purchase model without observing no-purchases. Estimation of market size when no-purchases are unobservable has rarely been attempted in the marketing or revenue management literature. Indeed, we point out that it is akin to the classical statistical problem of estimating the parameters of a binomial distribution with unknown population size and success probability, and hence likely to be challenging. However, when the purchase probabilities are given by a functional form such as a multinomial logit model, we propose an estimation heuristic that exploits the specification of the functional form, the variety of the offer sets in a typical RM setting, and qualitative knowledge of arrival rates. Finally, we perform simulations to show that the estimator is very promising in obtaining unbiased estimates of the population size and the model parameters.
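The classical problem the abstract alludes to, estimating both the population size N and the success probability p of a binomial from repeated counts, can be sketched by profiling the likelihood over candidate values of N. The data below are simulated and the simple profile-likelihood search is an illustration of why the problem is hard (the likelihood is notoriously flat in N), not the paper's estimation heuristic.

```python
# Sketch: profile-likelihood estimation of binomial (N, p) from
# repeated success counts, both parameters unknown.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_N, true_p = 75, 0.3
x = rng.binomial(true_N, true_p, size=50)   # observed success counts

best_N, best_ll = None, -np.inf
for N in range(x.max(), 500):
    p_hat = x.mean() / N                    # MLE of p for this candidate N
    ll = stats.binom.logpmf(x, N, p_hat).sum()
    if ll > best_ll:
        best_N, best_ll = N, ll
```

Because N and p enter the likelihood almost exclusively through their product, many (N, p) pairs fit nearly equally well, which is exactly the indeterminacy the paper's heuristic addresses by exploiting the functional form of the purchase probabilities and outside knowledge of arrival rates.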
Abstract:
Tissue protein hypercatabolism (TPH) is a most important feature of cancer cachexia, particularly with regard to skeletal muscle. The rat ascites hepatoma Yoshida AH-130 is a very suitable model system for studying the mechanisms involved in the processes that lead to tissue depletion, since it induces in the host a rapid and progressive muscle waste mainly due to TPH (Tessitore, L., G. Bonelli, and F. M. Baccino. 1987. Biochem. J. 241:153-159). Detectable plasma levels of tumor necrosis factor-alpha, associated with marked perturbations in hormonal homeostasis, have been shown to concur in forcing metabolism into a catabolic setting (Tessitore, L., P. Costelli, and F. M. Baccino. 1993. Br. J. Cancer. 67:15-23). The present study was directed to investigate whether beta 2-adrenergic agonists, which are known to favor skeletal muscle hypertrophy, could effectively antagonize the enhanced muscle protein breakdown in this cancer cachexia model. One such agent, i.e., clenbuterol, indeed largely prevented skeletal muscle waste in AH-130-bearing rats by restoring protein degradative rates close to control values. This normalization of protein breakdown rates was achieved through a decrease of the hyperactivation of the ATP-ubiquitin-dependent proteolytic pathway, as previously demonstrated in our laboratory (Llovera, M., C. García-Martínez, N. Agell, M. Marzábal, F. J. López-Soriano, and J. M. Argilés. 1994. FEBS (Fed. Eur. Biochem. Soc.) Lett. 338:311-318). By contrast, the drug did not exert any measurable effect on various parenchymal organs, nor did it modify the plasma levels of corticosterone and insulin, which were increased and decreased, respectively, in the tumor hosts.
The present data give new insights into the mechanisms by which clenbuterol exerts its preventive effect on muscle protein waste and seem to warrant the implementation of experimental protocols involving the use of clenbuterol or similar drugs in the treatment of pathological states involving TPH, particularly in skeletal muscle and heart, such as in the present model of cancer cachexia.