914 results for "Residual autocorrelation and autocovariance matrices"


Relevance: 100.00%

Abstract:

Shape memory alloys (SMAs) have the ability to undergo large deformations with minimal residual strain, as well as the extraordinary ability to undergo a reversible hysteretic shape change known as the shape memory effect. The shape memory effect of these alloys can be utilised to develop a convenient way of actively confining concrete sections to improve their shear strength, flexural ductility and ultimate strain. Most of the previous work on active confinement of concrete using SMAs has been carried out on circular sections. In this study, retrofitting strategies for active confinement of non-circular sections are proposed. The schemes presented in this paper are conceived with the aim of seismically retrofitting beam-column joints in non-seismically designed reinforced concrete buildings. SMAs are complex materials and their behaviour depends on a number of parameters. Depending on the alloying elements, SMAs behave differently under different conditions and are highly sensitive to variations in temperature, the phase in which they are used, the loading pattern, the strain rate and the pre-strain conditions. Therefore, a detailed discussion of the behaviour of SMAs under different thermo-mechanical conditions is presented first.

Relevance: 100.00%

Abstract:

Shape memory alloys (SMAs) have the ability to undergo large deformations with minimal residual strain, as well as the extraordinary ability to undergo a reversible hysteretic shape change known as the shape memory effect. The shape memory effect of these alloys can be utilised to develop a convenient way of actively confining concrete sections to improve their shear strength, flexural ductility and ultimate strain capacity. Most of the previous work on active confinement of concrete using SMAs has been carried out on circular sections. In this study, retrofitting strategies for active confinement of non-circular sections are proposed. The schemes presented in this paper are conceived with the aim of seismically retrofitting beam-column joints in non-seismically designed reinforced concrete buildings.

The complex material behaviour of SMAs depends on a number of parameters. Depending on the alloying elements, SMAs behave differently under different conditions and are highly sensitive to variations in temperature, the phase in which they are used, the loading pattern, the strain rate and the pre-strain conditions. Therefore, a detailed discussion of the behaviour of SMAs under different thermo-mechanical conditions is presented first in this paper.

Relevance: 100.00%

Abstract:

Pre-processing (PP) of the received symbol vector and channel matrices is an essential prerequisite for Sphere Decoder (SD)-based detection in Multiple-Input Multiple-Output (MIMO) wireless systems. PP is a computationally complex operation, yet it represents only a small fraction of the overall cost of detecting an OFDM MIMO frame in standards such as 802.11n. Despite this, real-time PP architectures are highly inefficient and dominate the resource cost of real-time SD architectures. This paper resolves this issue. By reorganising the ordering and QR decomposition sub-operations of PP, we describe a Field Programmable Gate Array (FPGA)-based PP architecture for the Fixed Complexity Sphere Decoder (FSD) applied to 4 × 4 802.11n MIMO that reduces resource cost by 50% compared with state-of-the-art solutions while maintaining real-time performance.
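To make the two PP sub-operations concrete, the following Python/NumPy sketch (a simplified illustration under assumed conventions, not the FPGA architecture proposed in the paper) performs a norm-based column ordering followed by QR decomposition of a 4 × 4 channel matrix, the kind of computation the PP stage carries out before the FSD tree search. The function name and the random test data are hypothetical.

import numpy as np

def fsd_preprocess(H, y):
    # Toy PP stage: column (antenna) ordering + QR decomposition.
    # The real FSD ordering is more specific; here columns are simply sorted by norm.
    order = np.argsort(np.linalg.norm(H, axis=0))
    H_ordered = H[:, order]
    Q, R = np.linalg.qr(H_ordered)      # QR decomposition of the reordered channel
    y_rotated = Q.conj().T @ y          # rotate the received vector accordingly
    return R, y_rotated, order

rng = np.random.default_rng(0)
H = (rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))) / np.sqrt(2)
y = rng.normal(size=4) + 1j * rng.normal(size=4)
R, y_rotated, order = fsd_preprocess(H, y)
print(order, np.round(np.abs(R), 2))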

Relevance: 100.00%

Abstract:

One of the greatest scientific advances of the twentieth century was the development of technology that allows large-scale genome sequencing. However, the information produced by sequencing does not by itself explain a genome's primary structure, evolution and functioning. To that end, fields such as molecular biology, genetics and bioinformatics are used to study the various properties and workings of genomes. In this work we are particularly interested in understanding in detail the decoding of the genome performed by the ribosome and in extracting general rules through the analysis of the genome's primary structure, namely codon context and codon distribution. These rules are poorly studied and understood, and it is not known whether they can be obtained with statistics and bioinformatics tools. Traditional methods for studying codon distribution in the genome and its context do not provide the tools needed to study these properties at genomic scale. Count tables with codon distributions, as well as absolute metrics, are currently available in databases, and several applications for characterising genetic sequences are also available. However, other kinds of statistical approaches and other information-visualisation methods were clearly missing. In the present work, mathematical and computational methods were developed for the analysis of codon context and for identifying regions where codon repetitions occur. New forms of information visualisation were also developed to allow interpretation of the information obtained. The statistical tools built into the model, such as clustering, residual analysis and codon adaptation indices, proved important for characterising the coding sequences of several genomes. The ultimate goal is for the information obtained to make it possible to identify the general rules governing codon context in any genome.
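As a rough illustration of the kind of codon-context statistic discussed above, the short Python sketch below counts adjacent codon pairs in an in-frame coding sequence; it is only a minimal example under assumed input, not the tools developed in the thesis, and the sequence shown is hypothetical.

from collections import Counter

def codon_pair_counts(cds):
    # Split an in-frame coding sequence into codons and count adjacent pairs (codon context).
    codons = [cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3)]
    return Counter(zip(codons, codons[1:]))

seq = "ATGGCTGCTAAAGCTGGTGCTTAA"   # hypothetical coding sequence
for (left, right), count in codon_pair_counts(seq).most_common():
    print(left, right, count)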

Relevance: 100.00%

Abstract:

Bivalve molluscs are a fishery resource of great importance to the national and international economy owing to their organoleptic characteristics, nutritional value and relevance in traditional cuisine. Nevertheless, they represent a food product of high risk to public health. Microbiological contamination (autochthonous and anthropogenic), which is chronic in the bivalve beds of estuarine-lagoon areas, is one of the main food-safety concerns. During the filtration inherent to respiration and feeding, bivalves passively bioaccumulate microorganisms, including pathogenic ones. Placing them on the market therefore requires prior purification to microbiological levels compatible with the legislation in force, safeguarding public health. Despite the monitoring of harvesting and production areas, preventive measures and depuration, the occurrence of outbreaks associated with the consumption of bivalves has increased. This is due to insufficient monitoring of the microbiological contamination of bivalves, which contributes to ineffective management of the product and its consequent under-valuation. The present work aimed to characterise the state of development of the bivalve industry in Portugal from a food-safety perspective and to analyse the crucial aspects of product monitoring and depuration, presenting broad alternatives applicable to the sector. Accordingly, a molecular-based methodology was developed that can be adapted to the monitoring of bivalves from shellfish-growing areas, as an alternative to the current Most Probable Number reference method, which is based solely on the quantification of Escherichia coli. The mussel (Mytilus edulis) from the Ria de Aveiro, a bivalve of national and international commercial interest, served as a model for the comparison of DNA extraction protocols. The methodology was developed so that the DNA extraction methods can be applied to other biological or environmental matrices. Beyond the direct detection and quantification of pathogenic bacteria, it can be applied to monitoring vertical microbial transfer in bivalve beds, characterising the spatio-temporal dynamics of microbial populations in the environment, and monitoring depuration processes. The potential application of bacteriophages or lytic enzymes to optimise purification processes was also addressed. The work carried out and the future perspectives proposed aim to contribute to revitalising and requalifying the bivalve industry by improving the food-safety level of bivalve molluscs marketed for human consumption, thereby adding value to this resource.

Relevance: 100.00%

Abstract:

Doctoral thesis, Education (History of Education), Universidade de Lisboa, Instituto de Educação, 2014

Relevance: 100.00%

Abstract:

This Database was generated during the development of a computer vision-based system for safety purposes in nuclear plants. The system aims at detecting and tracking people within a nuclear plant; further details may be found in the related thesis. The research was developed through a cooperation between the Graduate Electrical Engineering Program of the Federal University of Rio de Janeiro (PEE/COPPE, UFRJ) and the Nuclear Engineering Institute of the National Commission of Nuclear Energy (IEN, CNEN). The experimental part of this research was carried out in Argonauta, a nuclear research reactor belonging to IEN. The Database is made available below. All the videos are already rectified, and the projection and homography matrices for both cameras are given at the end. Please acknowledge the use of this Database in any publication.
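Since the Database is distributed with projection and homography matrices for both cameras, a typical first step for a user is to map pixel coordinates through one of those matrices. The Python sketch below is a minimal example with a placeholder 3 × 3 homography (the values shown are not the ones supplied with the Database).

import numpy as np

def apply_homography(H, point):
    # Map a 2-D pixel coordinate through a 3x3 homography using homogeneous coordinates.
    u, v, w = H @ np.array([point[0], point[1], 1.0])
    return u / w, v / w

H = np.array([[1.02, 0.01, -5.0],   # placeholder matrix; substitute the one from the Database
              [0.00, 0.98,  3.0],
              [1e-4, 0.00,  1.0]])
print(apply_homography(H, (320.0, 240.0)))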

Relevance: 100.00%

Abstract:

Humankind currently faces one of its great challenges: making the transition to a sustainable future. The energy sector plays a fundamental role in this transition, with a particular focus on the automotive sector, which releases large amounts of greenhouse gases into the atmosphere. The scarcity of petroleum resources is also a key issue. The need to tackle these problems has driven efforts to develop renewable, emission-neutral fuels. The first generation of biofuels, obtained from terrestrial agricultural crops, partly meets these requirements but cannot satisfy demand and competes with food production. Hence the interest in a second generation of biofuels produced from residual sources outside the food chain, which nevertheless still cannot meet feedstock needs. The third generation of biofuels addresses precisely these issues, as it relies on feedstocks that neither compete for agricultural land nor are used for food, with areal productivities substantially higher than those that conventional crops or residual biomass can provide. The third-generation feedstock is therefore microalgae, whose biomass productivities are extremely high, in addition to far higher productivities of lipids, carbohydrates and/or other high-value products. However, this type of biofuel production still faces technical problems that make it too expensive to compete economically with other biodiesel production routes. In line with the above, this work presents an economic and energy feasibility study of biodiesel produced from Chlorella vulgaris, describing the techniques and results of cultivating Chlorella vulgaris and subsequently producing biodiesel from the lipids obtained from it. To improve the harvesting of the microalgae, one of the most expensive stages, a pH increase and the addition of a flocculant (Pax XL-10) were tested; the former did not give satisfactory results, whereas the latter achieved yields of around 90%. Even with the improved harvesting step, the minimum price of biodiesel produced from Chlorella vulgaris oil, under the optimal cultivation conditions and maximum productivities found in the literature, was 8.76 €/L, because the economic analysis showed Pax XL-10 to be extremely expensive for flocculating microalgae intended for a low-value product such as biodiesel. Omitting flocculation reduces the price of the biodiesel to 7.85 €/L. The conclusion of this work is that, with the techniques used, producing biodiesel from Chlorella vulgaris alone is not economically viable; to make the process sustainable, further efforts would be needed to optimise biodiesel production, possibly by coupling it with the production of another biofuel from the residual extracted biomass and/or with the recovery of other higher-value products.

Relevance: 100.00%

Abstract:

Relation algebra is one of the state-of-the-art tools used by mathematicians and computer scientists for solving very complex problems. As a result, a computer algebra system for relation algebras called RelView has been developed at Kiel University. RelView works within the standard model of relation algebras. However, relation algebras have other models, which may have different properties. For example, in the standard model we always have L;L=L (the composition of two (heterogeneous) universal relations yields a universal relation). This is not true in some non-standard models. Therefore, any example in RelView will always satisfy this property even though it is not true in general. On the other hand, it has been shown that every relation algebra with relational sums and subobjects can be seen as a matrix algebra, analogous to the way binary relations between sets correspond to Boolean matrices. The aim of my research is to develop a new system that works with both standard and non-standard models of arbitrary relations using multiple-valued decision diagrams (MDDs). This system will implement relations as matrix algebras. The proposed structure is a library written in C which can be imported by other languages such as Java or Haskell.
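A concrete way to see the standard-model property L;L=L is to represent (heterogeneous) relations as Boolean matrices, where relational composition becomes Boolean matrix multiplication. The Python sketch below is only an illustration of that correspondence, not the proposed C/MDD library.

import numpy as np

def compose(R, S):
    # Relational composition of Boolean matrices: (R;S)[i,k] = OR_j (R[i,j] AND S[j,k]).
    return (R.astype(int) @ S.astype(int)) > 0

# Universal relations between sets of sizes 2, 3 and 4 (all entries true).
L12 = np.ones((2, 3), dtype=bool)
L23 = np.ones((3, 4), dtype=bool)
L13 = np.ones((2, 4), dtype=bool)
print(np.array_equal(compose(L12, L23), L13))   # True: in this model L;L = L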

Relevance: 100.00%

Abstract:

In this paper: a) the consumer's problem is studied over two periods, the second involving S states, the consumer being endowed with S+1 incomes and having access to N financial assets; b) the consumer is then representable by a continuously differentiable system of demands: commodity demands, asset demands and desirabilities of income (the S+1 Lagrange multipliers of the S+1 constraints); c) the multipliers can be transformed into subjective Arrow prices; d) the effects of the various incomes on these Arrow prices decompose into a compensation effect (an Antonelli matrix) and a wealth effect; e) the Antonelli matrix has rank S-N, the dimension of incompleteness, if the consumer can adjust financially when facing income shocks; f) the matrix has rank S if not; g) in the first case the matrix represents a residual aversion, in the second a fundamental aversion, and the difference between them is an aversion to illiquidity; this last relation corresponds to the Drèze-Modigliani decomposition (1972); h) the fundamental aversion also decomposes into an aversion to impatience and a risk aversion; i) the above decompositions span a third decomposition: if there exists a sure asset (to be defined, the usual definition being too specific), the fundamental aversion admits a three-component decomposition into an aversion to impatience, a residual aversion and an aversion to the illiquidity of risky assets; j) the formulas of the corresponding financial premiums are also presented.
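Writing π for the vector of subjective Arrow prices and y for the S+1 incomes (symbols chosen here only for illustration; the paper's notation may differ), items d)–f) can be sketched schematically as:

\[
\frac{\partial \pi}{\partial y}
  \;=\; \underbrace{A}_{\text{compensation effect (Antonelli matrix)}}
  \;+\; \underbrace{W}_{\text{wealth effect}},
\qquad
\operatorname{rank}(A)=
\begin{cases}
S-N, & \text{if the consumer can adjust financially to income shocks},\\
S, & \text{otherwise}.
\end{cases}
\]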

Relevance: 100.00%

Abstract:

In some circumstances, group actions perform better than individual actions. In such situations it is preferable to form coalitions. These coalitions may be disjoint or nested. The economics literature places a strong emphasis on modelling agreements in which coalitions of economic agents are disjoint sets. Yet everyday observation shows that political, environmental, free-trade and informal-insurance coalitions are most often nested. It therefore becomes imperative to understand the economics of nested coalitions. My thesis develops a framework for understanding the formation and performance of coalitions even when they are nested. In the first chapter, I develop a bargaining game that allows nested coalitions to form. I show that this game admits an equilibrium and I develop an algorithm to compute equilibrium allocations for symmetric games. I show that any network structure can be decomposed in a unique way into a structure of nested coalitions, and, under certain conditions, that this structure corresponds to an equilibrium structure of an underlying game. In the second chapter, I introduce a new notion of the core for the case where nested coalitions are allowed. I show that this notion is a natural generalisation of the core of a coalition structure. I go further by introducing more refined agents, obtaining the core of nested coalition structures, which I show to be a refinement of the first notion. In the rest of the thesis, I apply the theories developed in the first two chapters to concrete cases. The third chapter is an application of the one-to-one relationship established in the first chapter between coalition formation and network formation. I propose a realistic and effective model of informal insurance, introducing four major innovations into the economics literature on informal insurance: a merger of the group-based and social-network approaches, the possibility of nested informal-insurance organisations, an endogenous punishment scheme, and externalities. I characterise stable informal-insurance agreements and isolate the conditions that push agents to deviate. It is accepted in the literature that only individuals with high income can afford to violate informal-insurance agreements; I give the conditions under which this assumption holds, but I also show that it can be violated under other realistic conditions. Finally, I derive comparative-statics results under two different sharing norms. In the fourth and final chapter, I propose a model of informal insurance in which homogeneous groups are built on pre-existing trust relationships. These groups are nested and constitute risk-sharing sets. This approach is more general than the traditional group or network approaches. I characterise stable agreements without making assumptions about the discount rate, and I identify the characteristics of the stable networks that correspond to the lowest discount rates.
Although the purpose of informal insurance is to smooth consumption, I show that external effects, linked in particular to the value placed on interpersonal ties, strengthen stability. I develop a finite-step algorithm that equalises consumption for all linked individuals. Because the number of steps is finite (unlike the existing infinite-step algorithms), my algorithm can realistically inform economic policy. Finally, I give comparative-statics results for certain exogenous values of the model.
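One natural reading of the finite-step equalisation algorithm mentioned above is that each connected component of the link network pools its resources and every member consumes the component average. The Python sketch below illustrates only that reading; the function name and example network are hypothetical and the thesis's actual algorithm is not reproduced here.

from collections import defaultdict

def equalize_consumption(incomes, links):
    # Give every individual the average income of their connected component.
    graph = defaultdict(set)
    for a, b in links:
        graph[a].add(b)
        graph[b].add(a)
    consumption, seen = {}, set()
    for start in incomes:
        if start in seen:
            continue
        component, stack = [], [start]
        while stack:                      # traverse the component in finitely many steps
            node = stack.pop()
            if node in seen:
                continue
            seen.add(node)
            component.append(node)
            stack.extend(graph[node])
        average = sum(incomes[i] for i in component) / len(component)
        for member in component:
            consumption[member] = average
    return consumption

# Individuals 1 and 2 are linked; 3 is isolated and consumes its own income.
print(equalize_consumption({1: 10.0, 2: 4.0, 3: 7.0}, [(1, 2)]))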

Relevance: 100.00%

Abstract:

This thesis starts from the premise that the crisis of trade unionism results from a challenge to the collective identities that, until the 1970s, legitimised unions' representation of workers. Evidence of this can be found in the mobilisations, often conducted outside the unions and in conflict with them, of workers long marginalised by the institutional arrangements prevailing under the wage society. Work on union renewal notes, for its part, that unions struggle to take into account the needs and aspirations of these workers because their collective identity leads them to stay within the paths of institutionalised orientations and representations. However, these authors focus on how unionism, and union leaders in particular, can rebuild the representation of workers, not on how collective identities are transformed. Studies of unionism inherit a debate on social movements that ended up splitting theoretical approaches between those that conceptualise collective identities, but within contestable theorisations of how societies evolve, and those that under-theorise collective identities and consider that social movements emerge from political processes and resource mobilisation. Work on union renewal generally adopts this second approach and equates workers' movements with organisations by implicitly taking the goals of collective action as given. Yet a social movement is a concept; it cannot be reduced to an organisation without losing its heuristic value, which is to try to grasp the collective identities in conflict and the associated strategies. Drawing on a case study of the workers' movement in the Brazilian solidarity economy, this thesis therefore asks why new collective identities of workers emerge, and how or why union identities are transformed or reproduced when confronted with the emergence of new ways of defining the dominations to be fought and the orientations to pursue. Collective identities are operationalised as cognitive and normative matrices, which makes it possible to account for their evolving character depending on the modes of interaction. The case study shows that the autonomous mobilisations of marginalised workers carry new definitions of problems and transformative social practices that conflict with institutionalised union meanings and practices. It shows that union identities are transformed following deliberative interactions between these workers and unionists. However, the reconstruction of the trajectories of two unions (of the main Brazilian confederation) indicates that entering into such interactions does not depend on a rational decision, but on the unions' perception of workers' capacity to transform their relationship to work and to the world when they act collectively.
A final, corollary result lies in the falsification of the hypothesis, defended by part of the union-renewal literature, that unions, and union leaders in particular, can drive a transformation of collective representation by themselves aggregating the multiple collective identities. This hypothesis, which amounts to taking the goal of collective action as given, is contradicted by the data: they show that, in such a case, while institutional innovations led by the union do occur, these innovations favour the adaptation of unionism to the mutations of capitalism rather than the transformation of social relations of domination, because social ties with dominant groups, that is, the dominant cognitive interpretations of problems, then prevail.

Relevance: 100.00%

Abstract:

The present study aimed at the utilisation of microbial organisms for the production of good-quality chitin and chitosan. The three strains used for the study were Lactobacillus plantarum, Lactobacillus brevis and Bacillus subtilis. These strains were selected on the basis of their acid-producing ability, which lowers the pH of the fermenting substrate to prevent spoilage and thus brings about demineralisation of the shell. In addition, the proteolytic enzymes of these strains act on the proteinaceous covering of the shrimp and thus bring about deproteinisation of the shrimp shell waste. The two processes involved in chitin production can therefore be achieved to a certain extent through bacterial fermentation of shrimp shell. Optimisation parameters such as fermentation period, quantity of inoculum, type of sugar and concentration of sugar were studied for fermentation with the three different strains. For this, parameters such as pH, total titratable acidity (TTA), changes in sugar concentration, changes in microbial count and sensory changes were monitored. The fermentation study with Lactobacillus plantarum was carried out in 20% w/v jaggery broth for 15 days. The inoculum prepared yielded a cell concentration of approximately 10^8 CFU/ml. In the present study, lactic acid and dilute hydrochloric acid were used for initial pH adjustment because, without adjusting the initial pH, it took more than 5 hours for the lactic acid bacteria to convert glucose to lactic acid, and during this delay spoilage occurred due to putrefying enzymes active at neutral or higher pH. During the fermentation study, the pH first decreased, corresponding to an increase in TTA values, a clear indication of acid production by the strain. This trend continued while the proteolytic activity showed an increasing trend. When the available sugar source started depleting, proteolytic activity also decreased and the pH increased, which was clearly reflected in the sensory evaluation results. Lactic acid-treated samples showed a greater extent of demineralisation and deproteinisation at the end of the fermentation study than hydrochloric acid-treated samples, which can be attributed to the effect of the strong hydrochloric acid on the initial microbial count, directly affecting the fermentation process. At the end of fermentation, about 76.5% of the ash was removed in lactic acid-treated samples and 71.8% in hydrochloric acid-treated samples, and 72.8% of the protein in lactic acid-treated samples and 70.6% in hydrochloric acid-treated samples. The residual protein and ash in the fermented residue were reduced to permissible limits by treatment with 0.8 N HCl and 1 M NaOH. Characteristics of the chitin, such as chitin content, ash content, protein content and degree of N-acetylation, were studied. Quality characteristics such as viscosity, degree of deacetylation and molecular weight of the chitosan prepared were also compared. The chitosan samples prepared from lactic acid-treated material showed higher viscosity than the HCl-treated samples, but the degree of deacetylation was higher in the HCl-treated samples than in the lactic acid-treated ones. Characteristics of the protein liquor obtained, such as its biogenic composition, amino acid composition, total volatile base nitrogen and alpha-amino nitrogen, were also studied to assess its suitability as an animal feed supplement. Optimisation of the fermentation parameters for the Lactobacillus brevis fermentation study was also conducted and the parameters were standardised. A detailed fermentation study was then carried out in 20% w/v jaggery broth for 17 days.
The effect of the two different acid treatments (mild HCl and lactic acid) used for initial pH adjustment on chitin production was also studied. In this study, the trends in pH, sugar concentration and microbial count were similar to those of the Lactobacillus plantarum study. At the end of fermentation, the residual protein was only 32.48% in the HCl-treated samples and 31.85% in the lactic acid-treated samples; the residual ash content was about 33.68% in the HCl-treated samples and 32.52% in the lactic acid-treated ones. The fermented residue was converted to chitin with good characteristics by treatment with 1.2 M NaOH and 1 N HCl. Characteristics of the chitin samples prepared were studied, and the extent of N-acetylation assessed from the FTIR spectrum was about 84% in the HCl-treated chitin and 85% in the lactic acid-treated chitin. Chitosan was prepared from these samples by the usual chemical method, and its solubility, degree of deacetylation, viscosity and molecular weight were studied. The viscosity and molecular weight of these samples were comparatively lower than those of the chitosan prepared by Lactobacillus plantarum fermentation. Characteristics of the protein liquor obtained were analysed to determine its quality and suitability as an animal feed supplement. Another strain used for the study was Bacillus subtilis, and fermentation was carried out in 20% w/v jaggery broth for 15 days. Bacillus subtilis was found to be more efficient than the Lactobacillus species for deproteinisation and demineralisation, mainly owing to the difference in the proteolytic nature of the strains. About 84% of the protein and 72% of the ash were removed at the end of fermentation. Considering the statistical significance (P values) of the demineralisation and deproteinisation results, 0.8 N HCl was used for the demineralisation study and 0.6 M NaOH for the deproteinisation study. Properties of the chitin and chitosan prepared were analysed and studied.

Relevance: 100.00%

Abstract:

Multivariate lifetime data arise in various forms, including recurrent event data, when individuals are followed to observe the sequence of occurrences of a certain type of event, and correlated lifetimes, when an individual is followed for the occurrence of two or more types of events or when distinct individuals have dependent event times. In most studies there are covariates, such as treatments, group indicators, individual characteristics or environmental conditions, whose relationship to lifetime is of interest; this leads to the consideration of regression models. The well known Cox proportional hazards model and its variations, which use marginal hazard functions for the analysis of multivariate survival data, are not sufficient to explain the complete dependence structure of a pair of lifetimes on the covariate vector. Motivated by this, in Chapter 2 we introduced a bivariate proportional hazards model using the vector hazard function of Johnson and Kotz (1975), in which the covariates under study have different effects on the two components of the vector hazard function. The proposed model is useful in real-life situations for studying the dependence structure of a pair of lifetimes on the covariate vector. The well known partial likelihood approach is used for the estimation of the parameter vectors. We then introduced a bivariate proportional hazards model for gap times of recurrent events in Chapter 3; the model incorporates both the marginal and the joint dependence of the distribution of gap times on the covariate vector. In many fields of application, the mean residual life function is considered a more informative concept than the hazard function. Motivated by this, in Chapter 4 we considered a new semi-parametric model, a bivariate proportional mean residual life model, to assess the relationship between mean residual life and covariates for the gap times of recurrent events. The counting process approach is used for the inference procedures on the gap times of recurrent events. In many survival studies, the distribution of lifetime may depend on the distribution of censoring time. In Chapter 5, we introduced a proportional hazards model for duration times and developed inference procedures under dependent (informative) censoring. In Chapter 6, we introduced a bivariate proportional hazards model for competing risks data under right censoring. The asymptotic properties of the estimators of the parameters of the different models developed in the previous chapters were studied, and the proposed models were applied to various real-life situations.
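A schematic rendering of a bivariate proportional hazards model built on a vector hazard function, with each component carrying its own regression coefficients as described above (notation chosen here for illustration and possibly differing from the thesis), is:

\[
\boldsymbol{\lambda}(t_1,t_2 \mid \mathbf{z})
  = \bigl(\lambda_{01}(t_1,t_2)\, e^{\boldsymbol{\beta}_1^{\top}\mathbf{z}},\;
          \lambda_{02}(t_1,t_2)\, e^{\boldsymbol{\beta}_2^{\top}\mathbf{z}}\bigr),
\]

where (λ01, λ02) is a baseline vector hazard in the sense of Johnson and Kotz (1975) and allowing β1 ≠ β2 lets the covariates act differently on the two components.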

Relevance: 100.00%

Abstract:

Reliability analysis is a well-established branch of statistics that deals with the statistical study of different aspects of the lifetimes of a system of components. As we pointed out earlier, the major part of the theory and applications connected with reliability analysis has been discussed in terms of measures based on the distribution function. In the opening chapters of the thesis, we have described some attractive features of quantile functions and the relevance of their use in reliability analysis. Motivated by the works of Parzen (1979), Freimer et al. (1988) and Gilchrist (2000), who indicated the scope of quantile functions in reliability analysis, and as a follow-up to the systematic study in this connection by Nair and Sankaran (2009), in the present work we have tried to extend their ideas and develop the necessary theoretical framework for lifetime data analysis. In Chapter 1, we have given the relevance and scope of the study and a brief outline of the work we have carried out. Chapter 2 of the thesis is devoted to the presentation of various concepts and brief reviews of them, which are useful for the discussions in the subsequent chapters. In the introduction of Chapter 4, we have pointed out the role of ageing concepts in reliability analysis and in identifying life distributions. In Chapter 6, we have studied the first two L-moments of residual life and their relevance in various applications of reliability analysis; we have shown that the first L-moment of the residual life function is equivalent to the vitality function, which has been widely discussed in the literature. In Chapter 7, we have defined the percentile residual life in reversed time (RPRL) and derived its relationship with the reversed hazard rate (RHR). We have discussed the characterisation problem for the RPRL and demonstrated with an example that the RPRL for a fixed percentile does not determine the distribution uniquely.
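For orientation, the basic quantile-based objects referred to above can be sketched as follows (standard definitions in the spirit of Nair and Sankaran (2009), not results specific to this thesis):

\[
Q(u)=\inf\{x: F(x)\ge u\},\qquad q(u)=Q'(u),\qquad
H(u)=\frac{1}{(1-u)\,q(u)},\qquad
M(u)=\frac{1}{1-u}\int_u^1 \bigl(Q(p)-Q(u)\bigr)\,dp,
\]

where H(u) is the hazard quantile function and M(u) is the mean residual quantile function.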