Abstract:
This project was launched with the aim of extending and consolidating the application of the Integrative Practicum Model (Model de Pràcticum Integrador, MPI) to the majority of the practicums in the pedagogy, psychopedagogy and social education degrees of the Faculty of Education Sciences of our university. The Model had been introduced in recent years and its effectiveness had already been verified. The MPI rests on the conviction that the attainment of professional competences is fundamental and that these competences can be developed during the practicum. In this new practicum model, students work in interdisciplinary project teams; the faculty tutors form a working team with the tutors at the placement centres; together they design the reception and follow-up plans; and the follow-up tutorials and the work at the centre are carried out jointly by all the parties involved. The objectives set for the two-year duration of the MQD2006 project were: 1) To remove the technical and administrative obstacles within the faculty that hinder the extension and consolidation of the MPI model. 2) To publicise and raise the profile of the MPI model and the network of centres of excellence among students, faculty staff and the centres themselves. 3) To promote collaboration, knowledge exchange and faculty-centre innovation and research projects, by putting the faculty's working groups in contact with the centres and showcasing their potential. 4) To structure the new model according to the ECTS framework. 5) To analyse and deepen the competence-based approach of the new model.
Abstract:
This project consists of developing a graphical environment for generating SoCs based on the OpenRISC soft-core processor. The environment will allow different components to be added dynamically to an IP repository; it will display and allow the selection of any component available in that repository in order to attach it to the system bus and make it accessible to the OpenRISC processor; it will show at all times how the SoC is evolving; it will save each of the projects created with it; and it will finally allow the designed SoC to be generated.
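As a rough illustration of the kind of data model such an environment could maintain (a hypothetical sketch, not code from the project; the class and field names, and the choice of the Wishbone bus commonly paired with OpenRISC cores, are assumptions), in Python:

    from dataclasses import dataclass, field

    @dataclass
    class IPComponent:
        """One hardware block stored in the IP repository (e.g. a UART or memory controller)."""
        name: str
        bus_interface: str      # e.g. "wishbone", the bus typically used with OpenRISC cores
        base_address: int
        address_span: int

    @dataclass
    class SoCProject:
        """One SoC design built around the OpenRISC soft core."""
        name: str
        components: list = field(default_factory=list)

        def attach(self, comp: IPComponent) -> None:
            # Refuse overlapping address ranges before wiring the block to the system bus.
            for other in self.components:
                if (comp.base_address < other.base_address + other.address_span
                        and other.base_address < comp.base_address + comp.address_span):
                    raise ValueError(f"address overlap: {comp.name} vs {other.name}")
            self.components.append(comp)

    # Example usage: attach a UART to a fresh project.
    soc = SoCProject("demo_soc")
    soc.attach(IPComponent("uart0", "wishbone", 0x90000000, 0x1000))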
Abstract:
Nonlinear Noisy Leaky Integrate and Fire (NNLIF) models for networks of neurons can be written as Fokker-Planck-Kolmogorov equations for the probability density of neurons over membrane potentials, the main parameters of the model being the connectivity of the network and the noise. We analyse several aspects of the NNLIF model: the number of steady states, a priori estimates, blow-up issues and convergence toward equilibrium in the linear case. In particular, for excitatory networks, blow-up always occurs for initial data concentrated close to the firing potential. These results show how critically the behaviour of the network depends on the balance between noise and the excitatory/inhibitory interactions set by the connectivity parameter.
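For reference, a commonly used formulation of the NNLIF Fokker-Planck equation (a sketch of the standard model; the notation, with firing potential $V_F$, reset potential $V_R$, noise intensity $a>0$ and connectivity $b$, is assumed here rather than taken from this abstract):

\[
\partial_t p(v,t) + \partial_v\big[(-v + b\,N(t))\,p(v,t)\big] - a\,\partial_{vv} p(v,t) = \delta(v - V_R)\,N(t), \qquad v \le V_F,
\]
\[
N(t) = -a\,\partial_v p(V_F,t) \ge 0, \qquad p(V_F,t) = 0,
\]

where $p(v,t)$ is the probability density of neurons at membrane potential $v$, $N(t)$ is the firing rate of the network, and $b>0$ corresponds to an excitatory average coupling while $b<0$ corresponds to an inhibitory one. In more general versions the diffusion coefficient $a$ is also allowed to depend on $N(t)$.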
Abstract:
The evolution of a quantitative phenotype is often envisioned as a trait substitution sequence in which mutant alleles repeatedly replace resident ones. In infinite populations, the invasion fitness of a mutant in this two-allele representation of the evolutionary process is used to characterize features of long-term phenotypic evolution, such as singular points, convergence stability (established from first-order effects of selection), branching points, and evolutionary stability (established from second-order effects of selection). Here, we try to characterize long-term phenotypic evolution in finite populations from this two-allele representation of the evolutionary process. We construct a stochastic model describing evolutionary dynamics at non-rare mutant allele frequencies. We then derive stability conditions based on stationary average mutant frequencies in the limit of vanishing mutation rates. We find that the stability condition obtained from the second-order effects of selection is identical to convergence stability. Thus, in two-allele systems in finite populations, convergence stability is enough to characterize long-term evolution under the trait substitution sequence assumption. We perform individual-based simulations to confirm our analytic results.
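A minimal sketch of the sort of individual-based check mentioned in the last sentence (a generic haploid Wright-Fisher model for two alleles with a toy frequency-dependent fitness; the population size, fitness function and parameter values are illustrative assumptions, not the authors' setup):

    import numpy as np

    rng = np.random.default_rng(0)

    def average_mutant_frequency(pop_size=200, z_resident=0.5, z_mutant=0.51,
                                 generations=10_000):
        """Time-averaged mutant frequency in a haploid Wright-Fisher model
        with a toy frequency-dependent fitness function."""
        n_mutant = 1                           # start from a single mutant copy
        freqs = []
        for _ in range(generations):
            p = n_mutant / pop_size
            # Toy payoff: fitness depends on own trait value and the mean trait.
            z_mean = p * z_mutant + (1 - p) * z_resident
            w_mut = 1 + 0.1 * z_mutant * (1 - z_mean)
            w_res = 1 + 0.1 * z_resident * (1 - z_mean)
            prob = p * w_mut / (p * w_mut + (1 - p) * w_res)
            n_mutant = rng.binomial(pop_size, prob)   # resample the next generation
            freqs.append(n_mutant / pop_size)
            if n_mutant in (0, pop_size):             # absorption: loss or fixation
                break
        return np.mean(freqs)

    print(average_mutant_frequency())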
Abstract:
Network airlines have been increasingly focusing their operations on hub airports through the exploitation of connecting traffic, allowing them to take advantage of economies of traffic density, which are unequivocal in the airline industry. Less attention has been devoted to airlines' decisions on point-to-point thin routes, which could be served using different aircraft technologies and different business models. This paper examines, both theoretically and empirically, the impact on airlines' networks of the two major innovations in the airline industry in the last two decades: the regional jet technology and the low-cost business model. We show that, under certain circumstances, direct services on point-to-point thin routes can be viable and thus airlines may be interested in diverting passengers away from the hub. Keywords: regional jet technology; low-cost business model; point-to-point network; hub-and-spoke network. JEL Classification Numbers: L13; L2; L93
Abstract:
This paper presents an analysis of motor vehicle insurance claims relating to vehicle damage and to associated medical expenses. We use univariate severity distributions estimated with parametric and nonparametric methods. The methods are implemented using the statistical package R. The parametric analysis is limited to the estimation of normal and lognormal distributions for each of the two claim types. The nonparametric analysis presented involves kernel density estimation. We illustrate the benefits of applying transformations to the data prior to employing kernel-based methods, using a log-transformation and an optimal transformation, chosen from a class of transformations, that produces symmetry in the data. The central aim of this paper is to provide educators with material that can be used in the classroom to teach statistical estimation methods, goodness-of-fit analysis and, importantly, statistical computing in the context of insurance and risk management. To this end, we have included in the Appendix of this paper all the R code used in the analysis, so that readers, both students and educators, can fully explore the techniques described.
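The authors' R code is in the paper's Appendix and is not reproduced here; as a language-neutral illustration of the same two ideas (a parametric lognormal fit and kernel density estimation applied after a log-transformation), a minimal Python sketch on simulated claim severities (the data and parameter values are invented for the example):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    claims = rng.lognormal(mean=7.0, sigma=1.2, size=500)   # simulated claim severities

    # Parametric fit: lognormal, i.e. a normal fit on the log-claims.
    mu, sigma = stats.norm.fit(np.log(claims))

    # Nonparametric fit: Gaussian KDE on the log scale, then back-transform the
    # density so that it integrates to one on the original claim scale.
    kde = stats.gaussian_kde(np.log(claims))
    def kde_density(x):
        x = np.asarray(x, dtype=float)
        return kde(np.log(x)) / x

    grid = np.linspace(claims.min(), claims.max(), 5)
    print("lognormal parameters (mu, sigma):", mu, sigma)
    print("KDE density on grid:", kde_density(grid))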
Abstract:
We propose and analyze a new solution concept, the R solution, for three-person, transferable-utility, cooperative games. In the spirit of the Nash Bargaining Solution, our concept is founded on the predicted outcomes of simultaneous, two-party negotiations that would be the alternative to the grand coalition. These possibly probabilistic predictions are based on consistent beliefs. We analyze the properties of the R solution and compare it with the Shapley value and other concepts. The R solution exists and is unique. It belongs to the bargaining set and to the core whenever the latter is not empty. In fact, when the grand coalition can simply execute one of the three possible bilateral trades, the R solution is the most egalitarian selection of the bargaining set. Finally, we discuss how the R solution changes important conclusions of several well-known Industrial Organization models.
Abstract:
BACKGROUND: Hepatitis C virus (HCV) infection is a major cause of morbidity in HIV-infected individuals. Coinfection with HIV is associated with diminished HCV-specific immune responses and higher HCV RNA levels. AIMS: To investigate whether long-term combination antiretroviral therapy (cART) restores HCV-specific T cell responses and improves the control of HCV replication. METHODS: T cell responses were evaluated longitudinally in 80 HIV/HCV-coinfected individuals by ex vivo interferon-gamma ELISpot responses to HCV core peptides, which predominantly stimulate CD4(+) T cells. HCV RNA levels were assessed by real-time PCR in 114 individuals. RESULTS: The proportion of individuals with detectable T cell responses to HCV core peptides was 19% before starting cART, 24% in the first year on cART, and increased significantly to 45% and 49% after 33 and 70 months on cART (p=0.001). HCV-specific immune responses increased in individuals with chronic (+31%) and spontaneously cleared (+30%) HCV infection. Median HCV RNA levels before starting cART were 6.5 log10 IU/ml. During long-term cART, median HCV RNA levels decreased slightly compared to pre-cART levels (-0.3 log10 IU/ml, p=0.02). CONCLUSIONS: Successful cART is associated with increasing cellular immune responses to HCV core peptides and with a slight long-term decrease in HCV RNA levels. These findings are in line with the favourable clinical effects of cART on the natural history of hepatitis C and with the current recommendation to start cART earlier in HCV/HIV-coinfected individuals.
Abstract:
Species distribution models (SDMs) are widely used to explain and predict species ranges and environmental niches. They are most commonly constructed by inferring species' occurrence-environment relationships using statistical and machine-learning methods. The variety of methods that can be used to construct SDMs (e.g. generalized linear/additive models, tree-based models, maximum entropy, etc.), and the variety of ways in which such models can be implemented, permits substantial flexibility in SDM complexity. Building models with an appropriate amount of complexity for the study objectives is critical for robust inference. We characterize complexity as the shape of the inferred occurrence-environment relationships and the number of parameters used to describe them, and search for insights into whether additional complexity is informative or superfluous. By building 'under-fit' models, with insufficient flexibility to describe the observed occurrence-environment relationships, we risk misunderstanding the factors shaping species distributions. By building 'over-fit' models, with excessive flexibility, we risk inadvertently ascribing pattern to noise or building opaque models. However, model selection can be challenging, especially when comparing models constructed under different modeling approaches. Here we argue for a more pragmatic approach: researchers should constrain the complexity of their models based on the study objective, the attributes of the data, and an understanding of how these interact with the underlying biological processes. We discuss guidelines for balancing under-fitting against over-fitting and, consequently, how complexity affects decisions made during model building. Although some generalities are possible, our discussion reflects differences in opinion that favor simpler versus more complex models. We conclude that combining insights from both simple and complex SDM building approaches best advances our knowledge of current and future species ranges.
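As a generic illustration of the complexity trade-off discussed above (not taken from the paper; the toy occurrence-environment data and the scikit-learn models are assumptions made for the example), a sketch comparing SDMs of increasing flexibility by cross-validation:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures, StandardScaler

    rng = np.random.default_rng(2)
    temperature = rng.uniform(0, 30, size=400).reshape(-1, 1)   # environmental covariate
    # True occurrence probability peaks at intermediate temperatures (unimodal niche).
    p_true = np.exp(-((temperature[:, 0] - 18) / 5) ** 2)
    occurrence = rng.binomial(1, p_true)                        # presence/absence data

    # Increasing polynomial degree stands in for increasing model complexity.
    for degree in (1, 2, 6):
        model = make_pipeline(PolynomialFeatures(degree), StandardScaler(),
                              LogisticRegression(max_iter=1000))
        score = cross_val_score(model, temperature, occurrence, cv=5).mean()
        print(f"degree {degree}: mean CV accuracy {score:.3f}")

A degree-1 model cannot represent the unimodal niche (under-fitting), while a needlessly high degree adds parameters without improving out-of-sample performance (over-fitting).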
Abstract:
There are several experimental models describing in vivo eosinophil (EO) migration, including ip injection of a large volume of saline (SAL) or of Sephadex beads (SEP). The aim of this study was to investigate the mechanisms involved in EO migration in these two models. Two consecutive injections of SAL, given 48 hr apart, induced a selective recruitment of EO into the peritoneal cavity of rats, which peaked 48 hr after the last injection. SEP, when injected ip, also promoted EO accumulation in rats; the phenomenon was dose-related and peaked 48 hr after SEP injection. To investigate the mediators involved in this process, we showed that BW A4C, MK 886 and dexamethasone (DXA) inhibited the EO migration induced by SAL and SEP. To investigate the source of the EO chemotactic factor, we showed that mast cells and macrophages (MO), but not lymphocytes, incubated in vitro in the presence of SAL released a factor which induced EO migration. With SEP, only mast cells released a factor that induced EO migration, and this was inhibited by BW A4C, MK 886 and DXA. Furthermore, the chemotactic activity of SAL-stimulated mast cells was inhibited by antisera against interleukin (IL)-5 and IL-8. SAL-stimulated MO were only inhibited by anti-IL-8 antibodies, as were SEP-stimulated mast cells. These results suggest that the EO migration induced by SAL may be dependent on resident mast cells and MO and mediated by LTB4, IL-5 and IL-8. SEP-induced EO migration was dependent on mast cells and may be mediated by LTB4 and IL-8. Furthermore, IL-5 and IL-8 induced EO migration, which was also dependent on resident cells and mediated by LTB4. In conclusion, EO migration induced by SAL is dependent on mast cells and MO, whereas that induced by SEP is dependent on mast cells alone. Stimulated mast cells release LTB4, IL-5 and IL-8, while MO release LTB4 and IL-8. The IL-5 and IL-8 released by SAL- or SEP-stimulated resident cells may act in an autocrine fashion, thus potentiating LTB4 release.
Abstract:
Eosinophils play a central role in the establishment and outcome of bronchial inflammation in asthma. Animal models of allergy are useful to answer questions related to the mechanisms of allergic inflammation. We have used models of sensitized and boosted guinea pigs to investigate the nature of bronchial inflammation in allergic conditions. These animals develop marked bronchial infiltration composed mainly of CD4+ T-lymphocytes and eosinophils. Further provocation with antigen leads to degranulation of eosinophils and ulceration of the bronchial mucosa. Eosinophils are the first cells to increase in number in the mucosa after antigen challenge and depend on the expression of alpha4 integrin to adhere to the vascular endothelium and transmigrate to the mucosa. Blockade of alpha4 integrin expression with a specific antibody prevents not only the transmigration of eosinophils but also the development of bronchial hyperresponsiveness (BHR) to agonists in sensitized and challenged animals, clearly suggesting a role for this cell type in this altered functional state. Moreover, introduction of an antibody against Major Basic Protein into the airways also prevents the development of BHR in a similar model. BHR can also be suppressed by the use of FK506, an immunosuppressant that reduces the infiltration of eosinophils into the bronchi of allergic animals by almost 100%. These data support the concept that the eosinophil is the most important pro-inflammatory factor in bronchial inflammation associated with allergy.
Abstract:
In a recent paper, Bermúdez [2009] used bivariate Poisson regression models for ratemaking in car insurance, and included zero-inflated models to account for the excess of zeros and the overdispersion in the data set. In the present paper, we revisit this model in order to consider alternatives. We propose a 2-finite mixture of bivariate Poisson regression models to demonstrate that the overdispersion in the data requires more structure if it is to be taken into account, and that a simple zero-inflated bivariate Poisson model does not suffice. At the same time, we show that a finite mixture of bivariate Poisson regression models embraces zero-inflated bivariate Poisson regression models as a special case. Additionally, we describe a model in which the mixing proportions depend on covariates, so as to model the way in which each individual belongs to a separate cluster. Finally, an EM algorithm is provided to make fitting the models straightforward. These models are applied to the same automobile insurance claims data set as used in Bermúdez [2009], and it is shown that the modelling of the data set can be improved considerably.
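For context, the common-shock construction of the bivariate Poisson distribution that such regression models build on, and the form of a 2-finite mixture (a sketch of the standard formulation with generic notation, not copied from the paper):

\[
X_1 = Y_1 + Y_0, \qquad X_2 = Y_2 + Y_0, \qquad Y_i \sim \mathrm{Poisson}(\lambda_i) \ \text{independent},
\]

so that $\mathbb{E}[X_j] = \lambda_j + \lambda_0$ and $\mathrm{Cov}(X_1, X_2) = \lambda_0$. A 2-finite mixture of bivariate Poisson regressions then has density

\[
f(x_1, x_2) = \pi\, \mathrm{BP}\big(x_1, x_2;\, \lambda_1^{(1)}, \lambda_2^{(1)}, \lambda_0^{(1)}\big) + (1-\pi)\, \mathrm{BP}\big(x_1, x_2;\, \lambda_1^{(2)}, \lambda_2^{(2)}, \lambda_0^{(2)}\big),
\]

with $\log \lambda_j^{(k)} = \mathbf{x}'\boldsymbol{\beta}_j^{(k)}$ linking the rates to covariates and, as described in the abstract, the mixing proportion $\pi$ possibly modelled as a function of covariates as well; a zero-inflated bivariate Poisson model is recovered as the special case in which one component is degenerate at $(0,0)$.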