973 results for Monte Carlo study


Abstract:

INTRODUCTION: Malaria is a serious problem in the Brazilian Amazon region, and the detection of possible risk factors could be of great interest for public health authorities. The objective of this article was to investigate the association between environmental variables and the yearly registers of malaria in the Amazon region using Bayesian spatiotemporal methods. METHODS: We used Poisson spatiotemporal regression models to analyze yearly malaria counts in the Brazilian Amazon for the period from 1999 to 2008. In this study, we included covariates that could be important in the yearly prediction of malaria, such as the deforestation rate. We obtained the inferences using a Bayesian approach and Markov Chain Monte Carlo (MCMC) methods to draw samples from the joint posterior distribution of interest. The discrimination of different models was also discussed. RESULTS: The model proposed here suggests that the deforestation rate, the number of inhabitants per km², and the human development index (HDI) are important in the prediction of malaria cases. CONCLUSIONS: It is possible to conclude that human development, population growth, deforestation, and their associated ecological alterations are conducive to increasing malaria risk. We conclude that the use of Poisson regression models that capture the spatial and temporal effects under the Bayesian paradigm is a good strategy for modeling malaria counts.
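As a rough illustration of the MCMC machinery described above (not the paper's spatiotemporal model), the sketch below fits a plain Poisson regression of yearly case counts on a single deforestation covariate using a random-walk Metropolis sampler; the data, prior, and tuning constants are all invented.

```python
import math, random

random.seed(1)

# Toy data: yearly malaria counts vs. a standardized deforestation covariate.
deforestation = [-1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5]
cases = [3, 4, 6, 8, 12, 18, 25]

def log_posterior(b0, b1):
    """Poisson log-likelihood plus vague N(0, 10^2) priors on both coefficients."""
    lp = -(b0 ** 2 + b1 ** 2) / 200.0
    for x, y in zip(deforestation, cases):
        eta = b0 + b1 * x                    # log of the Poisson rate
        lp += y * eta - math.exp(eta) - math.lgamma(y + 1)
    return lp

# Random-walk Metropolis: propose a move, accept with prob min(1, posterior ratio).
b0 = b1 = 0.0
draws = []
for step in range(20_000):
    c0, c1 = b0 + random.gauss(0, 0.1), b1 + random.gauss(0, 0.1)
    if math.log(random.random()) < log_posterior(c0, c1) - log_posterior(b0, b1):
        b0, b1 = c0, c1
    if step >= 5_000:                        # discard burn-in
        draws.append(b1)

post_b1 = sum(draws) / len(draws)
print(round(post_b1, 2))   # positive posterior mean slope: more deforestation, more cases
```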

Abstract:

INTRODUCTION: Monte Carlo simulations have been used for selecting optimal antibiotic regimens for treatment of bacterial infections. The aim of this study was to assess the pharmacokinetic and pharmacodynamic target attainment of intravenous β-lactam regimens commonly used to treat bloodstream infections (BSIs) caused by Gram-negative rod-shaped organisms in a Brazilian teaching hospital. METHODS: In total, 5,000 patients were included in the Monte Carlo simulations of distinct antimicrobial regimens to estimate the likelihood of achieving free drug concentrations above the minimum inhibitory concentration (MIC; fT > MIC) for the requisite periods to clear distinct target organisms. Microbiological data were obtained from blood culture isolates harvested in our hospital from 2008 to 2010. RESULTS: In total, 614 bacterial isolates, including Escherichia coli, Enterobacter spp., Klebsiella pneumoniae, Acinetobacter baumannii, and Pseudomonas aeruginosa, were analyzed. Piperacillin/tazobactam failed to achieve a cumulative fraction of response (CFR) > 90% for any of the isolates. While standard dosing (short infusion) of β-lactams achieved target attainment for BSIs caused by E. coli and Enterobacter spp., pharmacodynamic target attainment against K. pneumoniae isolates was only achieved with ceftazidime and meropenem (prolonged infusion). Lastly, only prolonged infusion of high-dose meropenem approached an ideal CFR against P. aeruginosa; however, no antimicrobial regimen achieved an ideal CFR against A. baumannii. CONCLUSIONS: These data reinforce the use of prolonged infusions of high-dose β-lactam antimicrobials as a reasonable strategy for the treatment of BSIs caused by multidrug-resistant Gram-negative bacteria in Brazil.
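The target-attainment logic can be sketched as follows, assuming a one-compartment IV-bolus model with invented pharmacokinetic parameters and a hypothetical MIC distribution; the study itself simulated specific β-lactam regimens (including prolonged infusions) against observed MIC data.

```python
import math, random

random.seed(7)

def ft_above_mic(dose_mg, tau_h, cl, v, fu, mic):
    """Fraction of the dosing interval with free concentration above the MIC,
    for a one-compartment IV bolus model at steady state."""
    k = cl / v
    peak = (dose_mg / v) / (1 - math.exp(-k * tau_h))   # steady-state peak
    free_peak = fu * peak
    if free_peak <= mic:
        return 0.0
    return min(math.log(free_peak / mic) / k / tau_h, 1.0)

def cfr(mic_dist, target=0.5, n_patients=5000):
    """Cumulative fraction of response: probability of target attainment
    weighted by the MIC distribution of the isolates."""
    total = 0.0
    for mic, weight in mic_dist.items():
        hits = sum(
            ft_above_mic(
                1000, 8,
                random.lognormvariate(math.log(8.0), 0.3),   # CL (L/h), invented
                random.lognormvariate(math.log(25.0), 0.2),  # V (L), invented
                0.8, mic,                                    # 80% unbound, invented
            ) >= target
            for _ in range(n_patients)
        )
        total += weight * hits / n_patients
    return total

# Hypothetical MIC distribution: mg/L -> fraction of isolates.
result = cfr({0.5: 0.4, 2.0: 0.4, 8.0: 0.2})
print(round(result, 2))
```

A regimen is typically judged adequate when the CFR exceeds 90%, which is the criterion piperacillin/tazobactam failed in the study.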

Abstract:

ABSTRACT: The purpose of this work was to determine alternative radiotherapy regimens for the treatment of prostate cancer using external beam radiotherapy (EBRT) and low dose-rate brachytherapy (LDRBT) with Iodine-125 permanent implants that are biologically equivalent to conventional clinical treatments, by means of theoretical models and Monte Carlo (MC) techniques. The concepts of biologically effective dose (BED) and equivalent uniform dose (EUD), together with the linear-quadratic (LQ) model, were used to determine equivalent treatment regimens. In a first approach, the BED concept was used to determine: 1) hypofractionated EBRT schemes maintaining the late rectal complications of conventional regimens with total doses of 75.6 Gy, 77.4 Gy, 79.2 Gy and 81.0 Gy; and 2) the relationship between the total doses of EBRT and LDRBT that keeps the BED of the conventional treatment of 45 Gy of EBRT and 110 Gy of LDRBT. In a second approach, the MC code MCNPX was used to simulate EBRT and LDRBT dose distributions in two voxel phantoms segmented from computed tomography images of patients with prostate cancer. The EBRT and LDRBT simulation results were summed and an overall EUD determined in order to obtain: 1) schemes equivalent to the conventional treatment of 25 fractions of 1.8 Gy of EBRT in combination with 110 Gy of LDRBT; and 2) schemes equivalent to prostate EUDs of 67 Gy, 72 Gy, 80 Gy, 90 Gy, 100 Gy and 110 Gy. All results indicate a theoretical therapeutic gain in using hypofractionated EBRT schemes. For a rectal BED equivalent to the conventional scheme, a 2% increase in prostate BED was achieved with 5 fewer fractions. This increase grows as the number of fractions is reduced, amounting to 10-11% with 20 fewer fractions and 35-45% with 40 fewer fractions. 
Considering the EBRT simulation results, an average EUD of 107 Gy for the prostate and 42 Gy for the rectum was obtained with the conventional scheme of 110 Gy of LDRBT followed by 25 fractions of 1.8 Gy of EBRT. In terms of tumor control probability (equal EUD), delivering the EBRT in 66 fractions of 1.8 Gy, 56 fractions of 2 Gy, 40 fractions of 2.5 Gy, 31 fractions of 3 Gy, 20 fractions of 4 Gy or 13 fractions of 5 Gy is equivalent to this treatment. Relative to 66 fractions of 1.8 Gy, the generalized rectal EUD is reduced by 6% with 2.5 Gy fractions and by 10% with 4 Gy fractions. A total BED of 162 Gy was determined for the delivery of 25 fractions of 1.8 Gy of EBRT in combination with 110 Gy of LDRBT. By varying the total dose of LDRBT (TDLDRBT) as a function of the total dose of EBRT (TDEBRT) so as to maintain a BED of 162 Gy, the following relationship was obtained: ... The simulation results show that the rectal EUD decreases with increasing total LDRBT dose for EBRT doses per fraction (dEBRT) below 2.5 Gy, and increases for dEBRT of 3 Gy and above. For lower TDLDRBT (< 50 Gy), the rectum benefits from larger EBRT fractions; as TDLDRBT increases, the generalized rectal EUD becomes less dependent on dEBRT. This work shows that different radiotherapy regimens for prostate cancer can provide a therapeutic gain, whether by delivering a higher biological dose with constant late effects or by maintaining the tumor dose while reducing rectal toxicity. Cautious use of hypofractionated EBRT schemes, beyond the therapeutic benefit, can also bring advantages in patient convenience and cost savings. The simulation results of this study, together with the biological dose conversions for the treatment of prostate cancer, provide theoretical guidelines of interest for new clinical trials.
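The BED comparisons above rest on the linear-quadratic model, BED = n·d·(1 + d/(α/β)). A minimal sketch of finding iso-BED hypofractionated schedules follows; the α/β values of 1.5 Gy for prostate and 3 Gy for rectum are common literature assumptions, not necessarily those used in this work.

```python
def bed(n, d, ab):
    """Biologically effective dose for n fractions of d Gy (LQ model)."""
    return n * d * (1 + d / ab)

ab_tumor, ab_rectum = 1.5, 3.0        # assumed alpha/beta values (Gy)
ref = bed(25, 1.8, ab_tumor)          # conventional EBRT component: 25 x 1.8 Gy

# For each candidate dose per fraction, find the smallest n matching the tumor BED.
schedules = {}
for d in (2.0, 2.5, 3.0):
    n = 1
    while bed(n, d, ab_tumor) < ref - 1e-9:   # tolerance for float round-off
        n += 1
    schedules[d] = n
    print(d, n, round(bed(n, d, ab_rectum), 1))
```

Because the assumed tumor α/β is lower than the rectal one, shorter schedules reach the reference tumor BED at a lower rectal BED, which is the source of the therapeutic gain discussed above.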

Abstract:

Extreme value models are widely used in different areas. The Birnbaum–Saunders distribution has received considerable attention due to its physical arguments and its good properties. We propose a methodology based on extreme value Birnbaum–Saunders regression models, which includes model formulation, estimation, inference and model checking. We further conduct a simulation study to evaluate its performance. A statistical analysis of real-world extreme value environmental data using the methodology is provided as an illustration.
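For readers unfamiliar with the Birnbaum–Saunders distribution, it admits a convenient stochastic representation through a standard normal variable. The sketch below uses that representation to draw variates and checks a known property, that the median equals the scale parameter β; the shape and scale values are arbitrary.

```python
import math, random

random.seed(3)

def rbs(alpha, beta):
    """One Birnbaum-Saunders variate via its normal representation:
    T = beta*(a*Z/2 + sqrt((a*Z/2)^2 + 1))^2 with Z ~ N(0, 1)."""
    w = alpha * random.gauss(0, 1) / 2
    return beta * (w + math.sqrt(w * w + 1)) ** 2

sample = sorted(rbs(0.5, 2.0) for _ in range(20001))
median = sample[10000]
print(round(median, 2))    # the BS median equals the scale parameter beta
```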

Abstract:

We study the temperature-dependent magnetic susceptibility of a strained graphene quantum dot using the determinant quantum Monte Carlo method. Within the Hubbard model on a honeycomb lattice, our unbiased numerical results show that a relatively small interaction $U$ may lead to edge ferromagnetic-like behavior in the strained graphene quantum dot, and a possible room-temperature transition is suggested. Around half filling, the ferromagnetic fluctuations at the zigzag edge are markedly strengthened both by the on-site Coulomb interaction and by the strain, especially in the low-temperature region. The resulting strongly enhanced ferromagnetic-like behavior may be important for the development of many applications.

Abstract:

BACKGROUND: Lipid-lowering therapy is costly but effective at reducing coronary heart disease (CHD) risk. OBJECTIVE: To assess the cost-effectiveness and public health impact of Adult Treatment Panel III (ATP III) guidelines and compare with a range of risk- and age-based alternative strategies. DESIGN: The CHD Policy Model, a Markov-type cost-effectiveness model. DATA SOURCES: National surveys (1999 to 2004), vital statistics (2000), the Framingham Heart Study (1948 to 2000), other published data, and a direct survey of statin costs (2008). TARGET POPULATION: U.S. population age 35 to 85 years. TIME HORIZON: 2010 to 2040. PERSPECTIVE: Health care system. INTERVENTION: Lowering of low-density lipoprotein cholesterol with HMG-CoA reductase inhibitors (statins). OUTCOME MEASURE: Incremental cost-effectiveness. RESULTS OF BASE-CASE ANALYSIS: Full adherence to ATP III primary prevention guidelines would require starting (9.7 million) or intensifying (1.4 million) statin therapy for 11.1 million adults and would prevent 20,000 myocardial infarctions and 10,000 CHD deaths per year at an annual net cost of $3.6 billion ($42,000/QALY) if low-intensity statins cost $2.11 per pill. The ATP III guidelines would be preferred over alternative strategies if society is willing to pay $50,000/QALY and statins cost $1.54 to $2.21 per pill. At higher statin costs, ATP III is not cost-effective; at lower costs, more liberal statin-prescribing strategies would be preferred; and at costs less than $0.10 per pill, treating all persons with low-density lipoprotein cholesterol levels greater than 3.4 mmol/L (>130 mg/dL) would yield net cost savings. RESULTS OF SENSITIVITY ANALYSIS: Results are sensitive to the assumptions that LDL cholesterol becomes less important as a risk factor with increasing age and that little disutility results from taking a pill every day. LIMITATION: Randomized trial evidence for statin effectiveness is not available for all subgroups. 
CONCLUSION: The ATP III guidelines are relatively cost-effective and would have a large public health impact if implemented fully in the United States. Alternate strategies may be preferred, however, depending on the cost of statins and how much society is willing to pay for better health outcomes. FUNDING: Flight Attendants' Medical Research Institute and the Swanson Family Fund. The Framingham Heart Study and Framingham Offspring Study are conducted and supported by the National Heart, Lung, and Blood Institute.
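The headline figure can be reconstructed arithmetically: an incremental cost-effectiveness ratio (ICER) is simply incremental cost divided by incremental QALYs. The QALY total below is back-calculated for illustration and is not reported in the abstract.

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra dollars per extra QALY."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# $3.6 billion/year net cost; ~85,700 QALYs/year is back-calculated so that
# the ratio lands near the ~$42,000/QALY quoted above (illustration only).
print(round(icer(3.6e9, 85_700, 0.0, 0.0)))
```

A strategy is then "preferred" when its ICER falls below the willingness-to-pay threshold, here $50,000/QALY.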

Abstract:

This paper proposes a bootstrap artificial neural network based panel unit root test in a dynamic heterogeneous panel context. An application to a panel of bilateral real exchange rate series against the US Dollar for the 20 major OECD countries is provided to investigate Purchasing Power Parity (PPP). The combination of neural networks and bootstrapping significantly changes the findings of the economic study in favour of PPP.
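A stripped-down version of the bootstrap unit-root idea, with the neural-network component replaced by a plain Dickey-Fuller regression (no constant) and resampling of first differences under the null; the series and all settings are synthetic.

```python
import random

random.seed(2)

def df_tstat(y):
    """t-statistic on rho-1 in the regression dy_t = (rho-1)*y_{t-1} + e_t."""
    x, dy = y[:-1], [y[i + 1] - y[i] for i in range(len(y) - 1)]
    sxx = sum(a * a for a in x)
    b = sum(a * d for a, d in zip(x, dy)) / sxx
    s2 = sum((d - b * a) ** 2 for a, d in zip(x, dy)) / (len(x) - 1)
    return b / (s2 / sxx) ** 0.5

def bootstrap_pvalue(y, n_boot=500):
    """Resample first differences under the unit-root null, rebuild random
    walks, and compare their t-statistics with the observed one."""
    t_obs = df_tstat(y)
    diffs = [y[i + 1] - y[i] for i in range(len(y) - 1)]
    hits = 0
    for _ in range(n_boot):
        yb = [0.0]
        for _ in range(len(y) - 1):
            yb.append(yb[-1] + random.choice(diffs))
        hits += df_tstat(yb) <= t_obs
    return hits / n_boot

# Stationary AR(1) with coefficient 0.5: the unit-root null should be rejected.
y = [0.0]
for _ in range(300):
    y.append(0.5 * y[-1] + random.gauss(0, 1))
p_value = bootstrap_pvalue(y)
print(p_value < 0.05)
```

The paper's contribution is to let a neural network capture the (possibly nonlinear) dynamics before bootstrapping, which this sketch deliberately omits.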

Abstract:

The recent developments in high magnetic field 13C magnetic resonance spectroscopy with improved localization and shimming techniques have led to important gains in sensitivity and spectral resolution of 13C in vivo spectra in the rodent brain, enabling the separation of several 13C isotopomers of glutamate and glutamine. In this context, the assumptions used in spectral quantification might have a significant impact on the determination of 13C concentrations and the related metabolic fluxes. In this study, the time-domain spectral quantification algorithm AMARES (advanced method for accurate, robust and efficient spectral fitting) was applied to 13C magnetic resonance spectroscopy spectra acquired in the rat brain at 9.4 T, following infusion of [1,6-(13)C2]glucose. Using both Monte Carlo simulations and in vivo data, the goals of this work were: (1) to validate the quantification of in vivo 13C isotopomers using AMARES; (2) to assess the impact of prior knowledge on the quantification of in vivo 13C isotopomers using AMARES; (3) to compare AMARES and LCModel (linear combination of model spectra) for the quantification of in vivo 13C spectra. AMARES led to accurate and reliable 13C spectral quantification, with results similar to those obtained using LCModel, when the frequency shifts, J-coupling constants and phase patterns of the different 13C isotopomers were included as prior knowledge in the analysis.
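AMARES fits sums of exponentially damped sinusoids directly in the time domain. The sketch below illustrates only the core idea on a single noiseless synthetic resonance, recovering frequency and T2 from the constant ratio of successive FID samples (a Prony-style shortcut, not the AMARES nonlinear least-squares fit with prior knowledge).

```python
import cmath, math

# Synthetic noiseless FID: one resonance s[n] = A*exp((2j*pi*f - 1/T2)*n*dt)
f_true, T2_true, A, dt, N = 120.0, 0.05, 3.0, 1e-3, 256
pole_exp = 2j * math.pi * f_true - 1 / T2_true
s = [A * cmath.exp(pole_exp * n * dt) for n in range(N)]

# For a single damped exponential, s[n+1]/s[n] equals the constant pole
# z = exp(pole_exp*dt); average the ratios, then read f and T2 back off z.
z = sum(s[i + 1] / s[i] for i in range(N - 1)) / (N - 1)
f_est = cmath.phase(z) / (2 * math.pi * dt)
T2_est = -dt / math.log(abs(z))
print(round(f_est, 1), round(T2_est, 3))
```

With noise and overlapping resonances this shortcut breaks down, which is why AMARES instead solves a constrained nonlinear least-squares problem.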

Abstract:

A cryo-electron microscopy study of supercoiled DNA molecules freely suspended in cryo-vitrified buffer was combined with Monte Carlo simulations and gel electrophoretic analysis to investigate the role of intersegmental electrostatic repulsion in determining the shape of supercoiled DNA molecules. It is demonstrated here that a decrease of DNA-DNA repulsion by increasing concentrations of counterions causes a higher fraction of the linking number deficit to be partitioned into writhe. When counterions reach concentrations likely to be present under in vivo conditions, naturally supercoiled plasmids adopt a tightly interwound conformation. In these tightly supercoiled DNA molecules the opposing segments of interwound superhelix seem to directly contact each other. This form of supercoiling, where two DNA helices interact laterally, may represent an important functional state of DNA. In the particular case of supercoiled minicircles (178 bp) the delta Lk = -2 topoisomers undergo a sharp structural transition from almost planar circles in low salt buffers to strongly writhed "figure-eight" conformations in buffers containing neutralizing concentrations of counterions. Possible implications of this observed structural transition in DNA are discussed.

Abstract:

The purpose of this study was to develop a two-compartment metabolic model of brain metabolism to assess oxidative metabolism from [1-(11)C] acetate radiotracer experiments, using an approach previously applied in (13)C magnetic resonance spectroscopy (MRS), and to compare it with the one-tissue compartment model previously used in brain [1-(11)C] acetate studies. Compared with (13)C MRS studies, (11)C radiotracer measurements provide a single uptake curve representing the sum of all labeled metabolites, without chemical differentiation, but with higher temporal resolution. The reliability of the fitted metabolic fluxes was analyzed with Monte Carlo simulations using synthetic (11)C uptake curves, based on a typical arterial input function and previously published values of the neuroglial fluxes V(tca)(g), V(x), V(nt), and V(tca)(n) measured in dynamic (13)C MRS experiments. Assuming V(x)(g)=10 × V(tca)(g) and V(x)(n)=V(tca)(n), it was possible to assess the composite glial tricarboxylic acid (TCA) cycle flux V(gt)(g) (V(gt)(g)=V(x)(g) × V(tca)(g)/(V(x)(g)+V(tca)(g))) and the neurotransmission flux V(nt) from (11)C tissue-activity curves obtained within 30 minutes in the rat cortex with a beta-probe after a bolus infusion of [1-(11)C] acetate (n=9), resulting in V(gt)(g)=0.136±0.042 and V(nt)=0.170±0.103 μmol/g per minute (mean±s.d. of the group), in good agreement with (13)C MRS measurements.
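The composite flux defined above is a simple series combination of V(x) and V(tca); a small sketch (the V(tca) value is illustrative, chosen only to land near the reported V(gt)(g)):

```python
def composite_tca_flux(v_x, v_tca):
    """Composite glial TCA cycle flux V(gt) = V(x)*V(tca)/(V(x)+V(tca)):
    a series combination of exchange and cycle fluxes (same algebraic
    form as parallel resistors)."""
    return v_x * v_tca / (v_x + v_tca)

# Under the paper's assumption V(x) = 10*V(tca), V(gt) = (10/11)*V(tca).
v_tca = 0.15                       # umol/g per minute, illustrative value
print(round(composite_tca_flux(10 * v_tca, v_tca), 3))
```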

Abstract:

This paper analyses the impact of using different correlation assumptions between lines of business when estimating the risk-based capital reserve, the Solvency Capital Requirement (SCR), under Solvency II regulations. A case study is presented and the SCR is calculated according to the Standard Model approach. Alternatively, the requirement is then calculated using an Internal Model based on a Monte Carlo simulation of the net underwriting result at a one-year horizon, with copulas being used to model the dependence between lines of business. To address the impact of these model assumptions on the SCR we conduct a sensitivity analysis. We examine changes in the correlation matrix between lines of business and address the choice of copulas. Drawing on aggregate historical data from the Spanish non-life insurance market between 2000 and 2009, we conclude that modifications of the correlation and dependence assumptions have a significant impact on SCR estimation.
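A toy version of the Internal Model idea, assuming normal net results per line of business and a Gaussian copula (the study also examines other copulas and real market data); all monetary figures are invented. The SCR is read off as the 99.5% loss quantile of the simulated one-year total.

```python
import math, random

random.seed(11)

def correlated_normals(rho):
    """One draw of two standard normals with correlation rho (Gaussian copula
    with normal margins; a full model would use other margins/copulas)."""
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    return z1, rho * z1 + math.sqrt(1 - rho * rho) * z2

def scr_estimate(rho, n=100_000):
    """SCR as the 99.5% loss quantile of the simulated one-year net result."""
    totals = sorted(
        (10 + 30 * a) + (5 + 20 * b)          # two lines of business, EUR m
        for a, b in (correlated_normals(rho) for _ in range(n))
    )
    return -totals[int(0.005 * n)]

scr = {rho: scr_estimate(rho) for rho in (0.0, 0.5)}
print({r: round(v) for r, v in scr.items()})
```

Even in this toy setting the SCR rises noticeably with the assumed correlation, which is the sensitivity the paper quantifies.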

Abstract:

Oscillations have been increasingly recognized as a core property of neural responses that contribute to spontaneous, induced, and evoked activities within and between individual neurons and neural ensembles. They are considered a prominent mechanism for information processing within, and communication between, brain areas. More recently, it has been proposed that interactions between periodic components at different frequencies, known as cross-frequency couplings, may support the integration of neuronal oscillations at different temporal and spatial scales. The present study details methods based on an adaptive frequency tracking approach that improve the quantification and statistical analysis of oscillatory components and cross-frequency couplings. This approach allows for a time-varying instantaneous frequency, which is particularly important when measuring phase interactions between components. We compared this adaptive approach to traditional band-pass filters in their measurement of phase-amplitude and phase-phase cross-frequency couplings. Evaluations were performed with synthetic signals and EEG data recorded from healthy humans performing an illusory contour discrimination task. First, the synthetic signals, in conjunction with Monte Carlo simulations, highlighted two desirable features of the proposed algorithm vs. classical filter-bank approaches: resilience to broad-band noise and to oscillatory interference. Second, the analyses with real EEG signals revealed statistically more robust effects (i.e., improved sensitivity) when using the adaptive frequency tracking framework, particularly when identifying phase-amplitude couplings. This was further confirmed after generating surrogate signals from the real EEG data. Adaptive frequency tracking appears to improve the measurement of cross-frequency couplings through precise extraction of neuronal oscillations.
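Phase-amplitude coupling can be quantified with a mean-vector-length measure, sketched below on a synthetic signal whose "gamma" envelope is locked to a 6 Hz "theta" phase. In practice the phase and envelope would be extracted by band-pass filtering plus the Hilbert transform or, as proposed in the study, by adaptive frequency tracking; here they are constructed analytically to keep the sketch self-contained.

```python
import cmath, math

fs, dur = 500, 4.0                     # sampling rate (Hz) and duration (s)
n = int(fs * dur)
f_slow = 6.0                           # "theta" phase frequency (Hz)

def mvl(mod_depth):
    """Mean-vector-length phase-amplitude coupling: |mean(A(t)*exp(i*phi(t)))|
    normalized by the mean amplitude; phi is the slow-oscillation phase and A
    the fast-oscillation envelope."""
    acc, mean_a = 0j, 0.0
    for k in range(n):
        phi = 2 * math.pi * f_slow * k / fs
        a = 1 + mod_depth * math.cos(phi)   # envelope locked to the phase
        acc += a * cmath.exp(1j * phi)
        mean_a += a
    return abs(acc / n) / (mean_a / n)

pac_coupled, pac_uncoupled = mvl(0.8), mvl(0.0)
print(round(pac_coupled, 2), round(pac_uncoupled, 2))
```

A coupled signal yields a clearly nonzero coupling value while an unmodulated envelope yields a value near zero; significance is then usually assessed against surrogate data, as in the study.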

Abstract:

The quantity of interest for high-energy photon beam therapy recommended by most dosimetric protocols is the absorbed dose to water. Thus, ionization chambers are calibrated in absorbed dose to water, which is the same quantity as that calculated by most treatment planning systems (TPS). However, when measurements are performed in a low-density medium, the presence of the ionization chamber generates a perturbation at the level of the secondary particle range. Therefore, the measured quantity is close to the absorbed dose to a volume of water equivalent to the chamber volume. This quantity is not equivalent to the dose calculated by a TPS, which is the absorbed dose to an infinitesimally small volume of water. This phenomenon can lead to an overestimation of the absorbed dose measured with an ionization chamber of up to 40% in extreme cases. In this paper, we propose a method to calculate correction factors based on Monte Carlo simulations. These correction factors are obtained as the ratio of the absorbed dose to water in a low-density medium D̄(w,Q,V1)(low), averaged over a scoring volume V₁ for a geometry where V₁ is filled with the low-density medium, to the absorbed dose to water D̄(w,Q,V2)(low), averaged over a volume V₂ for a geometry where V₂ is filled with water. In the Monte Carlo simulations, D̄(w,Q,V2)(low) is obtained by replacing the volume of the ionization chamber by an equivalent volume of water, according to the definition of the absorbed dose to water. The method is validated in two different configurations, which allowed us to study the behavior of this correction factor as a function of depth in phantom, photon beam energy, phantom density and field size.

Abstract:

MOTIVATION: Regulatory gene networks contain generic modules, such as feedback loops, that are essential for the regulation of many biological functions. The study of the stochastic mechanisms of gene regulation is instrumental for understanding how cells maintain their expression at levels commensurate with their biological role, as well as for engineering gene expression switches of appropriate behavior. The lack of precise knowledge of the steady-state distribution of gene expression requires the use of Gillespie algorithms and Monte Carlo approximations. METHODOLOGY: In this study, we provide new exact formulas and efficient numerical algorithms for computing the steady state of a class of self-regulated genes, and we use them to model the stochastic expression of a gene of interest in an engineered network introduced in mammalian cells. The behavior of the genetic network is then analyzed experimentally in living cells. RESULTS: Stochastic models often reveal counter-intuitive experimental behaviors, and we find that this genetic architecture displays a unimodal behavior in mammalian cells, which was unexpected given its known bimodal response in unicellular organisms. We provide a molecular rationale for this behavior, and we implement it in the mathematical picture to explain the experimental results obtained from this network.
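A minimal Gillespie stochastic simulation of a negatively self-regulated gene (Hill-type repression of its own production; all parameters invented) illustrates the kind of steady-state computation discussed above:

```python
import random

random.seed(5)

def mean_copy_number(t_end=500.0, t_burn=100.0):
    """Gillespie SSA for a negatively self-regulated gene: the protein is
    produced at rate k/(1 + (p/K)^h) (repressed by its own product) and
    degraded at rate g*p; returns the time-weighted mean copy number."""
    k, K, h, g = 10.0, 20.0, 2.0, 0.1     # invented rate constants
    p, t = 0, 0.0
    acc = t_acc = 0.0
    while t < t_end:
        birth = k / (1 + (p / K) ** h)
        death = g * p
        total = birth + death
        dt = random.expovariate(total)     # time to the next reaction
        if t > t_burn:                     # discard the initial transient
            acc += p * dt
            t_acc += dt
        t += dt
        p += 1 if random.random() < birth / total else -1
    return acc / t_acc

mean_p = mean_copy_number()
print(round(mean_p))   # hovers around the deterministic fixed point (~30 here)
```

The paper's point is that for this model class such Monte Carlo runs can be replaced or cross-checked by exact steady-state formulas.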

Abstract:

This thesis investigates the extremal properties of certain risk models of interest in various applications from insurance, finance and statistics. It develops along two principal lines. In the first part, we focus on two univariate risk models, namely a deflated risk model and a reinsurance risk model. We derive their tail expansions under certain tail conditions on the underlying risks. The main results are illustrated by typical examples and numerical simulations, and are then applied in insurance, for instance to approximations of Value-at-Risk and conditional tail expectations. 
The second part of the thesis is devoted to three bivariate models. The first model concerns bivariate censoring of extreme events. For this model, we propose a class of estimators for both the tail dependence coefficient and the tail probability. These estimators are flexible thanks to a tuning parameter, and their asymptotic distributions are obtained under second-order bivariate slow-variation conditions on the model. We give some examples, present a small Monte Carlo simulation study, and apply the estimators to a real insurance data set. The objective of the second bivariate risk model is the investigation of the tail dependence coefficient of bivariate skew slash distributions. Such skew slash distributions are extensively used in statistical applications; they are generated mainly by normal mean-variance mixtures and scaled skew-normal mixtures, which, as our principal results show, lead to distinct tail dependence structures. 
The third bivariate risk model concerns the approximation of the component-wise maxima of skew elliptical triangular arrays. The theoretical results are based on certain tail assumptions on the underlying random radius.
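The tail dependence coefficient mentioned in the second part can be estimated empirically from ranks. The sketch below is a generic threshold estimator, not the tuned estimator class proposed in the thesis; the comonotone and independent pairs are synthetic benchmarks with known limits.

```python
import random

random.seed(9)

def upper_tail_dep(xs, ys, u=0.95):
    """Empirical upper tail dependence at threshold u:
    P(rank(Y) > u*n | rank(X) > u*n), computed from the samples' ranks."""
    n = len(xs)
    rank_x = {v: i for i, v in enumerate(sorted(xs))}
    rank_y = {v: i for i, v in enumerate(sorted(ys))}
    thr = u * n
    exceed = [(x, y) for x, y in zip(xs, ys) if rank_x[x] > thr]
    joint = sum(1 for x, y in exceed if rank_y[y] > thr)
    return joint / len(exceed)

n = 20000
xs = [random.random() for _ in range(n)]
ys_ind = [random.random() for _ in range(n)]

dep_com = upper_tail_dep(xs, xs)       # comonotone pair: coefficient -> 1
dep_ind = upper_tail_dep(xs, ys_ind)   # independent pair: -> 1 - u = 0.05
print(round(dep_com, 2), round(dep_ind, 2))
```

In practice the choice of threshold u trades bias against variance, which is what a tuning parameter such as the thesis's is designed to control.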