380 results for Dividend Imputation


Relevance: 10.00%

Abstract:

PURPOSE The implementation of genomic-based medicine is hindered by unresolved questions regarding data privacy and the delivery of interpreted results to health-care practitioners. We used DNA-based prediction of HIV-related outcomes as a model to explore critical issues in clinical genomics. METHODS We genotyped 4,149 markers in HIV-positive individuals. Variants allowed for prediction of 17 traits relevant to HIV medical care, inference of patient ancestry, and imputation of human leukocyte antigen (HLA) types. Genetic data were processed under a privacy-preserving framework using homomorphic encryption, and clinical reports describing potentially actionable results were delivered to health-care providers. RESULTS A total of 230 patients were included in the study. We demonstrated the feasibility of encrypting a large number of genetic markers, inferring patient ancestry, computing monogenic and polygenic trait risks, and reporting results under privacy-preserving conditions. The average execution time of a multimarker test on encrypted data was 865 ms on a standard computer. The proportion of tests returning potentially actionable genetic results ranged from 0 to 54%. CONCLUSIONS The model of implementation presented herein informs strategies for delivering genomic test results in clinical care. Data encryption to ensure privacy helps to build patient trust, a key requirement on the road to genomic-based medicine. Genet Med advance online publication, 14 January 2016; doi:10.1038/gim.2015.167.
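The abstract does not describe the cryptosystem or its interface, so the following Python sketch is only a toy reconstruction of the core idea: with an additively homomorphic scheme (Paillier-style here, with tiny, deliberately insecure parameters), a weighted multimarker score can be accumulated over encrypted genotype dosages and only the final score decrypted. All names, keys and weights are hypothetical.

```python
# Toy Paillier-style additively homomorphic scheme (tiny primes, no padding:
# NOT secure, illustration only). Requires Python 3.8+ for pow(x, -1, m).
import math
import random

class Paillier:
    def __init__(self, p=1_000_003, q=1_000_033):          # small primes, toy only
        self.n = p * q
        self.n2 = self.n * self.n
        self.g = self.n + 1
        self.lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)
        x = pow(self.g, self.lam, self.n2)                 # = 1 + lam*n (mod n^2)
        self.mu = pow((x - 1) // self.n, -1, self.n)

    def enc(self, m):
        r = random.randrange(1, self.n)
        return pow(self.g, m, self.n2) * pow(r, self.n, self.n2) % self.n2

    def dec(self, c):
        x = pow(c, self.lam, self.n2)
        return (x - 1) // self.n * self.mu % self.n

    def add(self, c1, c2):      # Enc(m1)*Enc(m2) mod n^2 = Enc(m1+m2)
        return c1 * c2 % self.n2

    def mul_const(self, c, k):  # Enc(m)^k mod n^2 = Enc(k*m)
        return pow(c, k, self.n2)

# Weighted multimarker score over encrypted genotype dosages (0/1/2).
phe = Paillier()
genotypes = [0, 1, 2, 1]                 # hypothetical marker dosages
weights = [3, 1, 2, 5]                   # hypothetical integer effect weights
score = phe.enc(0)
for g, w in zip(genotypes, weights):
    score = phe.add(score, phe.mul_const(phe.enc(g), w))
print(phe.dec(score))                    # 3*0 + 1*1 + 2*2 + 5*1 = 10
```

Real deployments use vetted libraries and far larger keys; the point is only that ciphertext addition and scalar multiplication suffice for linear multimarker risk scores.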

Relevance: 10.00%

Abstract:

Missing outcome data are common in clinical trials: despite a well-designed study protocol, some randomized participants may leave the trial early without providing some or all of the data, or may be excluded after randomization. Premature discontinuation causes loss of information, potentially resulting in attrition bias and problems in the interpretation of trial findings. The causes of information loss in a trial, known as mechanisms of missingness, may influence the credibility of the trial results. Trials with missing outcome data should ideally be analyzed on an intention-to-treat (ITT) rather than a per-protocol (PP) basis; however, true ITT analysis requires appropriate assumptions about, and imputation of, the missing data. Using a worked example from a published dental study, we highlight the key issues associated with missing outcome data in clinical trials, describe the most recognized approaches to handling them, and explain the principles of ITT and PP analysis.
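To make the ITT-versus-complete-case distinction concrete, here is a minimal Python sketch on invented data in which outcomes are more likely to be missing among treated non-responders; the pessimistic imputation rule is hypothetical and serves only as one end of a sensitivity range.

```python
# Hypothetical sketch: how the handling of missing outcomes can shift a
# two-arm trial estimate. Data and imputation rule are invented.
import numpy as np

rng = np.random.default_rng(1)
n = 200
arm = rng.integers(0, 2, n)                      # 0 = control, 1 = treatment
outcome = rng.binomial(1, np.where(arm == 1, 0.6, 0.4)).astype(float)
# Outcomes go missing more often among treated non-responders (not MCAR).
missing = rng.random(n) < np.where((arm == 1) & (outcome == 0), 0.4, 0.1)
outcome[missing] = np.nan

def risk_diff(y, a):
    return np.nanmean(y[a == 1]) - np.nanmean(y[a == 0])

print("complete-case estimate:", round(risk_diff(outcome, arm), 3))

# Single imputation under a pessimistic assumption for the treated arm
# (missing treated outcomes counted as failures): one end of a sensitivity range.
y_imp = outcome.copy()
y_imp[np.isnan(y_imp) & (arm == 1)] = 0.0
y_imp[np.isnan(y_imp) & (arm == 0)] = np.nanmean(outcome[arm == 0])
print("ITT-style estimate, pessimistic imputation:", round(risk_diff(y_imp, arm), 3))
```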

Relevance: 10.00%

Abstract:

The consumption capital asset pricing model is the standard economic model used to capture stock market behavior. However, empirical tests have pointed to its inability to account quantitatively for the high average rate of return and volatility of stocks over time for plausible parameter values. Recent research has suggested that the consumption of stockholders is more strongly correlated with the performance of the stock market than the consumption of non-stockholders. We model two types of agents: non-stockholders with standard preferences and stockholders with preferences that incorporate elements of the prospect theory developed by Kahneman and Tversky (1979). In addition to consumption, stockholders consider fluctuations in their financial wealth explicitly when making decisions. Data from the Panel Study of Income Dynamics are used to calibrate the labor income processes of the two types of agents. Each agent faces idiosyncratic shocks to his labor income as well as aggregate shocks to the per-share dividend, but markets are incomplete and agents cannot hedge consumption risks completely. In addition, consumers face both borrowing and short-sale constraints. Our results show that in equilibrium, agents hold different portfolios. Our model is able to generate a time-varying risk premium of about 5.5% while maintaining a low risk-free rate, thus suggesting a plausible explanation for the equity premium puzzle reported by Mehra and Prescott (1985).
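The abstract does not state the exact preference specification, but a standard Kahneman-Tversky value function over wealth fluctuations takes the piecewise form below; the parameters are the Tversky and Kahneman (1992) estimates, not necessarily those calibrated in this work.

```latex
v(x) =
\begin{cases}
x^{\alpha}, & x \ge 0, \\
-\lambda\,(-x)^{\beta}, & x < 0,
\end{cases}
\qquad \alpha = \beta = 0.88, \quad \lambda = 2.25
```

The kink at zero and λ > 1 make losses in financial wealth loom larger than equal-sized gains, which is what allows stockholders in such models to demand a sizeable premium for bearing equity risk.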

Relevance: 10.00%

Abstract:

The purpose of this dissertation was to estimate HIV incidence among individuals who had HIV tests performed at the Houston Department of Health and Human Services (HDHHS) public health laboratory, and to examine the prevalence of concurrent HIV and AIDS diagnoses among HIV cases reported between 2000 and 2007 in Houston/Harris County. The first study estimated the cumulative HIV incidence among individuals testing at the Houston public health laboratory using the Serologic Testing Algorithm for Recent HIV Seroconversion (STARHS) during the two-year study period (June 1, 2005 to May 31, 2007). HIV incidence was estimated using two independently developed statistical imputation methods, one developed by the Centers for Disease Control and Prevention (CDC) and the other by HDHHS. Among the 54,394 persons tested for HIV during the study period, 942 tested HIV positive (positivity rate = 1.7%). Of these, 448 (48%) were newly reported to the Houston HIV/AIDS Reporting System (HARS), and 417 of these 448 blood specimens (93%) were available for STARHS testing. STARHS classified 139 (33%) of the 417 specimens as new HIV infections. The CDC and HDHHS methods gave similar estimates of cumulative HIV incidence over the two-year study period: 862 per 100,000 persons (95% CI: 655-1,070) and 925 per 100,000 persons (95% CI: 908-943), respectively. Consistent with national findings, this study found that African Americans and men who have sex with men (MSM) accounted for most of the new HIV infections among individuals testing at the Houston public health laboratory. Using the CDC statistical method, the highest cumulative HIV incidence (2,176 per 100,000 persons [95% CI: 1,536-2,798]) was found among those tested at HIV counseling and testing sites, compared with sexually transmitted disease clinics (1,242 per 100,000 persons [95% CI: 871-1,608]) and city health clinics (215 per 100,000 persons [95% CI: 80-353]). This finding suggests the HIV counseling and testing sites in Houston were successful in reaching high-risk populations and testing them early for HIV. In addition, older age groups had higher cumulative HIV incidence but accounted for smaller proportions of new HIV infections. The incidence in the 30-39 age group (994 per 100,000 persons [95% CI: 625-1,363]) was 1.5 times that in the 13-29 age group (645 per 100,000 persons [95% CI: 447-840]); the incidences in the 40-49 age group (1,371 per 100,000 persons [95% CI: 765-1,977]) and the 50-or-above age group (1,369 per 100,000 persons [95% CI: 318-2,415]) were 2.1 times that of the youngest 13-29 age group. The increased HIV incidence in older age groups suggests that persons 40 or above remain at risk of contracting HIV infection, and HIV prevention programs should encourage more people aged 40 and above to test for HIV.

The second study investigated concurrent diagnoses of HIV and AIDS in Houston, where a concurrent HIV/AIDS diagnosis is defined as an AIDS diagnosis within three months of the HIV diagnosis. About one-third of HIV cases in Houston/Harris County were diagnosed with HIV and AIDS concurrently. Multivariable logistic regression showed that being male, Hispanic, older, or diagnosed in the private sector of care was positively associated with concurrent HIV and AIDS diagnosis. By contrast, men who had sex with men and also used injection drugs (MSM/IDU) had lower odds of concurrent HIV and AIDS diagnoses (OR = 0.64; 95% CI: 0.44-0.93). A sensitivity analysis comparing different elapsed-time definitions of concurrent diagnosis (1-month, 3-month, and 12-month cut-offs) affected the size of the odds ratios but not their direction. The results of these two studies, one describing the characteristics of individuals newly infected with HIV and the other describing persons diagnosed with HIV and AIDS concurrently, can serve as a reference for HIV prevention program planning in Houston/Harris County.
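For orientation only, the sketch below shows a generic cumulative-incidence calculation with a normal-approximation confidence interval. The study's CDC and HDHHS estimators are more elaborate (STARHS recency classification, window rescaling and imputation for untested specimens), so this naive binomial treatment does not reproduce the published 862-925 per 100,000 figures.

```python
# Generic cumulative incidence per 100,000 with a Wald (normal) CI.
# Illustrative only; not the study's STARHS-based estimators.
import math

def cum_incidence_ci(cases, population, z=1.96, per=100_000):
    p = cases / population
    se = math.sqrt(p * (1 - p) / population)
    return p * per, (p - z * se) * per, (p + z * se) * per

# 139 STARHS-classified recent infections among 54,394 tested persons,
# treated naively as a simple proportion:
est, lo, hi = cum_incidence_ci(139, 54_394)
print(f"{est:.0f} per 100,000 (95% CI {lo:.0f}-{hi:.0f})")
```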

Relevance: 10.00%

Abstract:

Logistic regression is one of the most important tools in the analysis of epidemiological and clinical data. Such data often contain missing values for one or more variables. Common practice is to eliminate all individuals for whom any information is missing; this deletion approach does not make efficient use of the available information and often introduces bias. Two methods were developed to estimate logistic regression coefficients for mixed dichotomous and continuous covariates, including partially observed binary covariates. The data were assumed missing at random (MAR). One method (PD) used the predictive distribution as a weight to average the logistic regressions performed over all possible values of the missing observations; the second method (RS) used a variant of a resampling technique. Seven additional methods were compared with these two approaches in a simulation study: (1) analysis based on only the complete cases; (2) substituting the mean of the observed values for the missing value; (3) an imputation technique based on the proportions of observed data; (4) regressing the partially observed covariates on the remaining continuous covariates; (5) regressing the partially observed covariates on the remaining continuous covariates conditional on the response variable; (6) regressing the partially observed covariates on the remaining continuous covariates and the response variable; and (7) the EM algorithm. Both proposed methods showed smaller standard errors (s.e.) for the coefficient involving the partially observed covariate, and for the other coefficients as well. However, both methods, especially PD, are computationally demanding; thus, for the analysis of large data sets with partially observed covariates, further refinement of these approaches is needed.
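The dissertation's PD algorithm is not spelled out in the abstract; the following Python sketch reconstructs the general idea under stated assumptions: each record with a missing binary covariate is expanded into both possible values, weighted by an estimated predictive probability, and a weighted logistic regression is fitted. It is a toy reconstruction, not the exact method.

```python
# Toy "predictive distribution" weighting for a partially observed binary
# covariate x, with fully observed continuous covariate z and outcome y.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
z = rng.normal(size=n)                        # fully observed covariate
x = rng.binomial(1, 1 / (1 + np.exp(-z)))     # binary covariate, depends on z
y = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * x + z - 0.2))))
obs = rng.random(n) > 0.3                     # ~30% of x missing at random

# Step 1: predictive model P(x = 1 | z) fitted on complete cases.
pred = LogisticRegression().fit(z[obs].reshape(-1, 1), x[obs])
p1 = pred.predict_proba(z[~obs].reshape(-1, 1))[:, 1]

# Step 2: expand incomplete rows into x=0 and x=1 copies, weighted (1-p1, p1).
Z = np.r_[z[obs], z[~obs], z[~obs]]
X = np.r_[x[obs], np.zeros((~obs).sum()), np.ones((~obs).sum())]
Y = np.r_[y[obs], y[~obs], y[~obs]]
W = np.r_[np.ones(obs.sum()), 1 - p1, p1]

fit = LogisticRegression().fit(np.c_[X, Z], Y, sample_weight=W)
print("coefficients (x, z):", fit.coef_.round(2))
```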

Relevance: 10.00%

Abstract:

My dissertation focuses mainly on Bayesian adaptive designs for phase I and phase II clinical trials. It includes three specific topics: (1) proposing a novel two-dimensional dose-finding algorithm for biological agents, (2) developing Bayesian adaptive screening designs to provide more efficient and ethical clinical trials, and (3) incorporating missing late-onset responses into early stopping decisions. Treating patients with novel biological agents is becoming a leading trend in oncology. Unlike cytotoxic agents, for which toxicity and efficacy monotonically increase with dose, biological agents may exhibit non-monotonic patterns in their dose-response relationships. Using a trial with two biological agents as an example, we propose a phase I/II trial design to identify the biologically optimal dose combination (BODC), defined as the dose combination of the two agents with the highest efficacy and tolerable toxicity. A change-point model is used to reflect the fact that the dose-toxicity surface of the combined agents may plateau at higher dose levels, and a flexible logistic model is proposed to accommodate a possibly non-monotonic dose-efficacy relationship. During the trial, we continuously update the posterior estimates of toxicity and efficacy and assign patients to the most appropriate dose combination. We propose a novel dose-finding algorithm to encourage sufficient exploration of untried dose combinations in the two-dimensional space. Extensive simulation studies show that the proposed design has desirable operating characteristics in identifying the BODC under various patterns of dose-toxicity and dose-efficacy relationships. Trials of combination therapies are playing an increasingly important role in the battle against cancer. To handle more efficiently the large number of combination therapies that must be tested, we propose a novel Bayesian phase II adaptive screening design to simultaneously select among possible treatment combinations involving multiple agents. Our design formulates the selection procedure as a Bayesian hypothesis testing problem in which the superiority of each treatment combination is equated to a single hypothesis. During the trial, we use the current posterior probabilities of all hypotheses to adaptively allocate patients to treatment combinations. Simulation studies show that the proposed design substantially outperforms the conventional multi-arm balanced factorial trial design: it yields a significantly higher probability of selecting the best treatment, provides higher power to identify the best treatment at the end of the trial, and allocates substantially more patients to efficacious treatments. The design is most appropriate for trials combining multiple agents to screen out the efficacious combination for further investigation.

Phase II trials are usually single-arm studies conducted to test the efficacy of experimental agents and to decide whether they are promising enough to advance to phase III. Interim monitoring is employed to stop a trial early for futility and avoid assigning an unacceptable number of patients to inferior treatments. We propose a Bayesian single-arm phase II design with continuous monitoring for estimating the response rate of the experimental drug. To address the issue of late-onset responses, we use a piecewise exponential model to estimate the hazard function of the time-to-response data and handle missing responses using a multiple imputation approach. We evaluate the operating characteristics of the proposed method through extensive simulation studies and show that it reduces the total trial duration and yields desirable operating characteristics across different physician-specified lower bounds of the response rate and different true response rates.
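A minimal sketch of the continuous futility-monitoring idea, assuming a conjugate Beta-Binomial model: the prior, target response rate and stopping cutoff below are hypothetical, and the actual design additionally models time-to-response and multiply imputes missing late-onset outcomes.

```python
# Bayesian futility monitoring in a single-arm phase II trial: with a
# Beta(a, b) prior on the response rate p, stop early if the posterior
# probability that p exceeds the target p0 drops below a cutoff.
from scipy import stats

def futility_stop(responses, enrolled, p0=0.30, cutoff=0.05, a=0.5, b=0.5):
    post = stats.beta(a + responses, b + enrolled - responses)
    pr_promising = 1 - post.cdf(p0)       # P(p > p0 | data)
    return pr_promising < cutoff, pr_promising

for n, r in [(10, 1), (20, 3), (30, 9)]:
    stop, pr = futility_stop(r, n)
    print(f"n={n:2d} responses={r}: P(p>0.30)={pr:.3f} -> {'stop' if stop else 'continue'}")
```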

Relevance: 10.00%

Abstract:

Maximizing data quality may be especially difficult in trauma-related clinical research. Strategies are needed to improve data quality and to assess the impact of data quality on clinical predictive models. This study had two objectives. The first was to compare missing data between two multi-center trauma transfusion studies: a retrospective study (RS) using medical chart data with minimal data quality review, and the PRospective Observational Multi-center Major Trauma Transfusion (PROMMTT) study with standardized quality assurance. The second was to assess the impact of missing data on clinical prediction algorithms by evaluating blood transfusion prediction models using PROMMTT data. RS (2005-06) and PROMMTT (2009-10) investigated trauma patients receiving ≥ 1 unit of red blood cells (RBC) from ten Level I trauma centers. Missing data were compared for 33 variables collected in both studies using mixed effects logistic regression (including random intercepts for study site). Massive transfusion (MT) patients received ≥ 10 RBC units within 24 h of admission. Correct classification percentages for three MT prediction models were evaluated using complete case analysis and multiple imputation based on the multivariate normal distribution. A sensitivity analysis for missing data was conducted to estimate the upper and lower bounds of correct classification under best- and worst-case assumptions about the missing data. Most variables (17/33 = 52%) had <1% missing data in both RS and PROMMTT. Of the remaining variables, 50% demonstrated less missingness in PROMMTT, 25% had less missingness in RS, and 25% were similar between studies. Missing percentages for MT prediction variables in PROMMTT ranged from 2.2% (heart rate) to 45% (respiratory rate). For variables missing >1%, study site was associated with missingness (all p ≤ 0.021). Survival time predicted missingness for 50% of RS and 60% of PROMMTT variables. Complete case proportions for the MT models ranged from 41% to 88%. Complete case analysis and multiple imputation demonstrated similar correct classification results. Sensitivity analysis upper-lower bound ranges for the three MT models were 59-63%, 36-46%, and 46-58%. Prospective collection of ten-fold more variables with data quality assurance reduced overall missing data. Study site and patient survival were associated with missingness, suggesting that data were not missing completely at random and that complete case analysis may lead to biased results. Evaluating clinical prediction model accuracy may be misleading in the presence of missing data, especially with many predictor variables. The proposed sensitivity analysis estimating correct classification under upper (best-case) and lower (worst-case) bounds may be more informative than multiple imputation, which provided results similar to complete case analysis.
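The best/worst-case bound idea reduces to simple arithmetic, sketched below with invented counts: cases that cannot be scored because of missing predictors are counted as all correctly classified for the upper bound and all misclassified for the lower bound.

```python
# Best/worst-case sensitivity bounds on correct classification when some
# cases cannot be scored due to missing predictors. Counts are invented.
def classification_bounds(correct_complete, n_complete, n_missing):
    n_total = n_complete + n_missing
    lower = correct_complete / n_total                  # worst case: all wrong
    upper = (correct_complete + n_missing) / n_total    # best case: all correct
    return lower, upper

lo, hi = classification_bounds(correct_complete=520, n_complete=700, n_missing=300)
print(f"correct classification between {lo:.0%} and {hi:.0%}")
```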

Relevance: 10.00%

Abstract:

Objective: The purpose of this study is to compare the stages of breast cancer at presentation between insured and uninsured patients diagnosed at The Rose, an active non-profit breast healthcare organization, to determine whether uninsured patients present with more advanced-stage breast cancer than their insured counterparts. Study Design: Retrospective cross-sectional study. Methods: The study included 1,265 patients who received breast healthcare services and were diagnosed with breast cancer at The Rose between FY 2007 and FY 2012. Of these, 738 patients were presumably uninsured, since their breast healthcare services were sponsored through various funding sources and they were navigated into treatment through The Rose patient navigation program. We compared breast cancer stages between women who had insurance and those who did not, and analyzed the effects of age and race/ethnicity, along with insurance status, on the stage of breast cancer at diagnosis. We calculated the odds ratio using contingency tables, and estimated odds ratios (ORs) and 95% confidence intervals (CIs) using ordinal logistic regression, applying a multiple imputation method for missing tumor stage data. Results: The ordinal logistic regression analysis with ordered tumor stage as the dependent variable and uninsured status as the independent variable gave an odds ratio of 1.73 (OR = 1.73; p-value < 0.05; 95% CI: 1.36-2.12). Conclusions: Insurance status is a strong predictor of the stage of breast cancer diagnosed among women seen at The Rose. Uninsured women seen at The Rose are almost twice as likely to present at an advanced stage of breast cancer as their insured counterparts.
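As a generic illustration of the contingency-table step (the reported OR of 1.73 came from ordinal logistic regression on multiply imputed stage data, not from this calculation), a 2x2 odds ratio with a Woolf log-scale confidence interval can be computed as follows; the counts are hypothetical.

```python
# 2x2 odds ratio with a Woolf (log-scale) 95% confidence interval.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a,b: exposed with/without outcome; c,d: unexposed with/without outcome."""
    orr = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)       # SE of log(OR)
    return orr, orr * math.exp(-z * se), orr * math.exp(z * se)

# Hypothetical counts: advanced stage yes/no among uninsured vs insured.
orr, lo, hi = odds_ratio_ci(120, 618, 60, 467)
print(f"OR = {orr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```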

Relevance: 10.00%

Abstract:

Background: Malignancies arising in the large bowel cause the second largest number of deaths from cancer in the Western world. Despite the progress made during the last decades, colorectal cancer remains one of the most frequent and deadly neoplasias in western countries. Methods: A genomic study of human colorectal cancer was carried out on a total of 31 tumoral samples, corresponding to different stages of the disease, and 33 non-tumoral samples. The study was carried out by hybridisation of the tumour samples against a reference pool of non-tumoral samples using Agilent Human 1A 60-mer oligo microarrays. The results obtained were validated by qRT-PCR. In the subsequent bioinformatics analysis, gene networks were built by means of Bayesian classifiers, variable selection and bootstrap resampling. The consensus among all the induced models produced a hierarchy of dependences and, thus, of variables. Results: After an exhaustive pre-processing stage to ensure data quality (missing value imputation, probe quality, data smoothing and intraclass variability filtering), the final dataset comprised a total of 8,104 probes. Next, a supervised classification approach and data analysis were carried out to obtain the most relevant genes, two of which are directly involved in cancer progression and in particular in colorectal cancer. Finally, a supervised classifier was induced to classify new unseen samples. Conclusions: We have developed a tentative model for the diagnosis of colorectal cancer based on a biomarker panel. Our results indicate that the gene profile described herein can discriminate between non-cancerous and cancerous samples with 94.45% accuracy using different supervised classifiers (AUC values in the range of 0.955 to 0.997).
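A hedged sketch of the final classification step, using synthetic data in place of the 8,104-probe dataset and GaussianNB as a stand-in Bayesian classifier; it illustrates only the cross-validated AUC mechanic, not the authors' pipeline.

```python
# Cross-validated tumour vs non-tumour classification with AUC, on
# synthetic expression profiles matching the study's sample sizes.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_tumour, n_normal, n_probes = 31, 33, 50      # reduced probe count for the toy
X = np.vstack([rng.normal(0.5, 1, (n_tumour, n_probes)),   # shifted "tumour" profiles
               rng.normal(0.0, 1, (n_normal, n_probes))])
y = np.r_[np.ones(n_tumour), np.zeros(n_normal)]

auc = cross_val_score(GaussianNB(), X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {auc.mean():.3f} +/- {auc.std():.3f}")
```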

Relevance: 10.00%

Abstract:

The vulnerability of grazing livestock systems highlights the need for tools to assess and mitigate the adverse impact of drought. The recent and rapid progress in remote sensing has awakened interest in tapping into potential applications, triggering intensive efforts to develop innovations in a number of spheres. One of these areas is climate risk management, where the use of vegetation indices facilitates the assessment of drought. This research analyzes drought impacts and evaluates the potential of new technologies such as remote sensing to manage drought risk in extensive livestock systems. Three essays in drought risk management are developed to: (i) assess the economic impact of drought on a livestock farm in the Andalusian dehesa, (ii) build drought vulnerability maps for Chilean grazing lands, and (iii) design and evaluate the potential of an index insurance policy to address the risk of drought in grazing lands in Coquimbo, Chile.

In the first essay, a dynamic and stochastic farm model combining climatic, agronomic, socio-economic and ecological aspects is designed to assess drought risk. The model simulates a representative livestock farm in the dehesa of Andalusia for the period 1999-2010. Burn (historical) analysis and Monte Carlo simulation are used to identify the main sources of risk at the farm; most notably, early summer and early winter are identified as periods of peak risk. Moreover, there is a significant time lag between climate risk and economic risk, and the latter lasts longer than the former. Intensity, frequency and duration are shown to be three crucial attributes that shape the economic impact of drought. A sensitivity analysis of farm management strategies demonstrates that lowering the stocking rate reduces the farmer's exposure to drought risk but entails a reduction in the expected gross margin.

The second essay, on mapping drought vulnerability in Chilean grazing lands, proposes and builds an index of economic risk (IRESP) that is replicable and simple to interpret. The methodology integrates risk factors and adaptation strategies to deliver a measure of Value at Risk, that is, the maximum expected loss in a year at the 5% significance level. Mapping IRESP reveals spatial patterns and significant differences in drought vulnerability across Chilean grazing lands. Spatial autocorrelation measures show that systemic risk is considerably larger in the South than in the Northern and Central regions. Furthermore, vulnerability is not always correlated with climate risk, demonstrating the importance of considering adaptation strategies. These results show that IRESP conveys relevant information and that vulnerability maps may be useful tools for policy design and decision-making in drought risk management.

The third essay develops a stochastic model to estimate the actuarially fair premium and evaluates the potential of an index insurance policy to manage drought risk in Coquimbo, a relevant livestock farming region of Chile. It addresses basis risk, identified in the literature as the main limitation of index insurance, which refers to the imperfect correlation between the index and farm losses. A Bayesian approach is used to assess the impact on basis risk of three alternative contract-design guidelines: (i) a cluster zoning that considers space-time aspects, (ii) a guarantee period bounded to fit the phenological cycles of the pasture, and (iii) the triggering index threshold. Results show that both the proposed zoning and the guarantee period reduce basis risk considerably, whereas the triggering threshold has an ambiguous effect on it. In addition, cluster zoning helps to mitigate the systemic risk faced by insurers. These results highlight that adequate contract design may yield a double dividend: increasing the farmer's utility while reducing the cost of insurance. An efficient contract design, coupled with advances in remote sensing and an appropriate institutional framework, is the basis for the sound operation of an insurance program. New technologies offer significant potential for innovation in climate risk management; progress in this field may provide important social gains in developing countries and vulnerable regions, where tools to efficiently manage systemic risks such as drought can be a means to foster development.
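The Value-at-Risk quantity behind IRESP has a simple Monte Carlo reading, sketched below with invented distributions: simulate gross margins under climate risk and take the 5th percentile.

```python
# Monte Carlo Value at Risk (5%) for a simulated farm gross margin.
# All distributional assumptions and parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)
n_sim = 100_000
rainfall = rng.gamma(shape=4.0, scale=75.0, size=n_sim)   # mm/season (toy)
forage = np.clip(rainfall / 300.0, 0, 1)                  # relative forage supply
margin = 250.0 * forage - 80.0                            # EUR/ha gross margin (toy)

var_5 = np.percentile(margin, 5)                          # 5% worst outcome
print(f"Value at Risk (5%): {var_5:.1f} EUR/ha")
print(f"expected margin:    {margin.mean():.1f} EUR/ha")
```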

Relevance: 10.00%

Abstract:

Introducing cover crops (CC) interspersed with intensively fertilized crops in rotation has the potential to reduce nitrate leaching. This paper evaluates various strategies involving CC between maize crops and compares the economic and environmental results with respect to a typical maize-fallow rotation. The comparison is performed through stochastic (Monte Carlo) simulation models of farm profits, using probability distribution functions (pdfs) of yield and N-fertilizer saving fitted with data collected from various field trials, and pdfs of crop prices and fertilizer cost fitted from statistical sources. Stochastic dominance relationships are obtained to rank the most profitable strategies from a farm financial perspective. A two-criterion comparison scheme is proposed to rank alternative strategies based on farm profit and nitrate leaching levels, taking the maize-fallow rotation as the baseline scenario. The results show that when CC biomass is sold as forage instead of being kept in the soil, greater profit and less nitrate leaching are achieved than in the baseline scenario. While the fertilizer saving is lower if the CC is sold than if it is kept in the soil, the revenue obtained from the sale of the CC compensates for the reduced fertilizer saving. The results suggest that CC could provide a double dividend of greater profit and reduced nitrate leaching in intensive irrigated cropping systems in Mediterranean regions.
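A minimal sketch of the first-order stochastic dominance check used to rank strategies, on invented profit distributions: strategy A dominates B if A's empirical CDF lies at or below B's over the whole profit range.

```python
# First-order stochastic dominance check between two simulated profit
# distributions. Distributions and parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
profit_sell_cc = rng.normal(950, 180, 10_000)   # CC biomass sold as forage (toy)
profit_fallow = rng.normal(820, 200, 10_000)    # maize-fallow baseline (toy)

grid = np.linspace(min(profit_sell_cc.min(), profit_fallow.min()),
                   max(profit_sell_cc.max(), profit_fallow.max()), 500)
cdf = lambda x, g: np.searchsorted(np.sort(x), g, side="right") / x.size

# A dominates B (first order) iff F_A(x) <= F_B(x) for all x on the grid.
dominates = np.all(cdf(profit_sell_cc, grid) <= cdf(profit_fallow, grid))
print("selling CC biomass first-order dominates fallow:", bool(dominates))
```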

Relevance: 10.00%

Abstract:

The purpose of this study was to compare a number of state-of-the-art methods in airborne laser scanning (ALS) remote sensing with regard to their capacity to describe tree size inequality and other indicators related to forest structure. The indicators chosen were based on the analysis of the Lorenz curve: Gini coefficient (GC), Lorenz asymmetry (LA), and the proportions of basal area (BALM) and stem density (NSLM) stocked above the mean quadratic diameter. Each method belonged to one of these estimation strategies: (A) estimating indicators directly; (B) estimating the whole Lorenz curve; or (C) estimating a complete tree list. Across these strategies, the most popular statistical methods for the area-based approach (ABA) were used: regression, random forest (RF), and nearest neighbour imputation. The latter included distance metrics based on either RF (NN-RF) or most similar neighbour (MSN). In the case of tree list estimation, methods based on individual tree detection (ITD) and semi-ITD, both combined with MSN imputation, were also studied. The most accurate method was direct estimation by best subset regression, which obtained the lowest cross-validated coefficients of variation of the root mean squared error, CV(RMSE), for most indicators: GC (16.80%), LA (8.76%), BALM (8.80%) and NSLM (14.60%). Similar figures [CV(RMSE) of 16.09%, 10.49%, 10.93% and 14.07%, respectively] were obtained by MSN imputation of tree lists by ABA, a method that also showed a number of additional advantages, such as better distributing the residual variance along the predictive range. In light of our results, ITD approaches may be clearly inferior to ABA with regard to describing the structural properties related to tree size inequality in forested areas.
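Of the indicators above, the Gini coefficient has a compact closed form on sorted data; the sketch below computes GC from the basal areas of a hypothetical plot (Lorenz asymmetry, BALM and NSLM are omitted for brevity).

```python
# Gini coefficient of tree basal areas, from sorted data:
# G = sum_i (2i - n - 1) x_i / (n * sum_i x_i), x sorted ascending.
import numpy as np

def gini(x):
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    return (2 * np.arange(1, n + 1) - n - 1) @ x / (n * x.sum())

dbh = np.array([8, 11, 14, 15, 19, 22, 24, 28, 33, 41])   # diameters (cm), toy plot
ba = np.pi * (dbh / 2) ** 2                               # basal areas
print(f"GC = {gini(ba):.3f}")
```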

Relevance: 10.00%

Abstract:

Our society has experienced an intense pace of technological evolution in recent decades, driving the continuous development of new technologies that improve both the quality and the reliability of service; such is the case of 4G. Today in Spain, the fourth generation of mobile communications is led by LTE, while LTE-Advanced has only begun to be deployed in the largest cities over the last few months. For this reason, it was considered interesting to plan a network for an area that, so far, has no LTE-Advanced coverage. The nature of the terrain must also be taken into account, since it differs from the urban environments of the main cities with LTE-Advanced, such as Madrid, Barcelona or Valencia. The study of this semi-rural area is of great interest because one of the objectives of the fourth generation is to bring high-quality internet access to places that optical fibre cannot reach, such as these semi-rural zones. To add further interest to the study, the 800 MHz band was chosen for the network deployment. This band, previously used for digital terrestrial television broadcasting, was recently released for mobile communications in what is known as the Digital Dividend. LTE-Advanced is starting to be deployed in this band, although it will not be in real use until November 2015, so 4G networks are currently using the 2.6 GHz band. The use of the 800 MHz band will bring improvements for both users and operators, which are examined throughout the project. The planning goes through several optimization and expansion phases in which both the radio aspects and the capacity of the network are analyzed. Signals such as RSRP, RSSI and RSRQ are studied, and for the capacity analysis a set of users, suitably distributed over the whole area, is defined to allow a detailed study of the network's capacity. Finally, several tests demonstrate the importance of MIMO technology in both LTE and LTE-Advanced.
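The quality analysis ties these measurements together through the standard LTE relationship RSRQ = N_RB · RSRP / RSSI (3GPP TS 36.214); a small sketch in dB form, with illustrative values:

```python
# RSRQ from RSRP and RSSI in dB form: RSRQ = 10*log10(N_RB) + RSRP - RSSI,
# where N_RB is the number of resource blocks of the measurement bandwidth.
import math

def rsrq_db(rsrp_dbm, rssi_dbm, n_rb=50):      # n_rb = 50 for a 10 MHz carrier
    return 10 * math.log10(n_rb) + rsrp_dbm - rssi_dbm

print(f"RSRQ = {rsrq_db(-95.0, -72.0):.1f} dB")   # about -6 dB: a good-quality cell
```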

Relevance: 10.00%

Abstract:

The LTE standard has been singled out as one of the keys for telecom operators to address, in a cost-efficient way, the growth in mobile traffic demand foreseen for the coming years, since its core network is more scalable and its radio interface more flexible than those of its predecessor technologies. At the same time, regulators need to guarantee adequate, equitable and non-discriminatory access to radio spectrum, enabling a stable environment for the deployment of advanced mobile communication networks. Besides the reform of the spectrum regulatory framework in Europe, which allowed the deployment of new technologies in the historic GSM bands, additional spectrum has been allocated to IMT systems in new frequency bands, which in turn has set out new challenges for technology and regulation. The fragmentation of the spectrum available for mobile communications has boosted the development of carrier aggregation techniques in the most recent releases of the LTE standard, which allow a better exploitation of radio resources as a whole. Nonetheless, spectrum below 1 GHz remains scarce: mobile traffic keeps increasing and the 900 MHz band is still used for GSM services, which has aggravated the dispute between terrestrial broadcasting and mobile communication services over the upper part of the UHF band. In particular, the 700 MHz band is shaping up as one of the next bands to be allocated to mobile services, although its release by the current Digital Terrestrial Television networks raises considerable difficulties in the Member States where DTT is the main free-to-air audiovisual platform, opening a debate on the long-term audiovisual model in Europe. In addition, this decade's public policies promoting access to fast and ultra-fast broadband have set ambitious objectives for 2020, both at the European level and in the various Member States; universal access to broadband of at least 30 Mbps is one of the main challenges. The expectations raised by LTE and the availability of new frequency bands give fixed wireless access services particular relevance for meeting these public policy objectives, which, as has been acknowledged on several occasions, can only be achieved with a mix of different technologies. For this PhD Dissertation, a series of techno-economic models were developed to carry out a prospective analysis of three cases of special relevance in the deployment of LTE networks: first, the economic valuation of the 700 MHz band; second, the assessment of business models and cost reduction considering femtocell technologies; and finally, the feasibility of LTE fixed wireless access networks to close the digital divide in access to 30 Mbps broadband.

Regarding the application of techno-economic analysis to the pricing of 700 MHz spectrum, the results reveal two core issues. First, operators need more spectrum to meet mobile traffic demand forecasts in the medium term. Second, there is a substantial difference in the deployment costs of an LTE network depending on whether sub-1 GHz spectrum is available or not, but this difference decreases as additional sub-1 GHz spectrum is added. Thus, allocating the 700 MHz band to mobile communication services brings a relevant reduction in deployment costs if the operator holds no spectrum in the 800 MHz band, but not if it already holds low-band spectrum for the deployment. The price operators will be willing to pay for 700 MHz spectrum will therefore depend on whether they already hold spectrum in the 800 MHz band. However, since competition for this new spectrum will be weaker, the revenues expected from 700 MHz awards will generally be lower than those from the Digital Dividend, even though for some operators this spectrum would be as valuable as that at 800 MHz.

Regarding femtocell deployment, some conclusions can be drawn in terms of deployment cost savings and the business models femtocells enable. The savings provided by a joint macro-femto LTE network compared with an exclusively macrocellular deployment grow as the bandwidth available to the macrocell layer decreases. Hence, for a convergent operator, deploying femtocells makes economic sense if the available bandwidth is scarce (around 2x10 MHz), which in Spain may reflect the case of fixed-segment operators that are new entrants in the mobile market. Open access models are interesting for exclusively mobile operators, because they make costs more flexible by substituting macrocell base stations with femtocells, but the femtocells need to be deployed in relatively densely populated areas so that each can offload traffic from several macrocell users simultaneously. Nonetheless, femtocells are beneficial in any case if the user assumes the costs of the femtocell and its backhaul, which only seems likely if they are integrated into a business model commercializing new services. Therefore, in much of the studied casuistry, femtocell deployment only makes sense if it raises revenues per user by commercializing value-added services that require guaranteed quality of service, thereby exploiting femtocells' main competitive advantage over WiFi.

Finally, regarding the role of LTE in providing fixed wireless access services for 30 Mbps broadband, a TD-LTE model was developed and a prospective study was carried out for the Spanish case using the techno-economic methodology. The results predict an FTTH coverage footprint of 74% of households by 2020 and show that a TD-LTE network in the 3.5 GHz band is feasible for increasing the coverage of 30 Mbps services by a further 14 percentage points. Together with the coverage of other networks, viable deployments would bring 30 Mbps coverage to 95% of Spain by 2020. In summary, the results show in all cases the capability of LTE technology to face new challenges related to mobile traffic growth, especially critical in urban areas, and to close the digital divide in fast broadband access in the most rural areas.
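The cost asymmetry between low and high bands can be illustrated with a toy link-budget calculation (not the Dissertation's model): under an Okumura-Hata-style path-loss formula, formally valid only up to 1.5 GHz and stretched here purely for illustration, the same maximum allowed path loss yields a larger cell radius, hence fewer sites, at 800 MHz than at 2600 MHz.

```python
# Toy illustration of why sub-1 GHz spectrum cuts deployment cost: same
# maximum allowed path loss (MAPL) -> larger cells at lower frequency.
# Okumura-Hata urban model; parameters are illustrative only, and the model
# is applied beyond its formal 1.5 GHz validity limit for comparison.
import math

def hata_radius_km(freq_mhz, mapl_db, hb=30.0, hm=1.5):
    a_hm = (1.1 * math.log10(freq_mhz) - 0.7) * hm - (1.56 * math.log10(freq_mhz) - 0.8)
    A = 69.55 + 26.16 * math.log10(freq_mhz) - 13.82 * math.log10(hb) - a_hm
    B = 44.9 - 6.55 * math.log10(hb)
    return 10 ** ((mapl_db - A) / B)

area_km2, mapl = 1000.0, 140.0
for f in (800, 2600):
    r = hata_radius_km(f, mapl)
    sites = area_km2 / (2.6 * r * r)           # hexagonal cell area ~ 2.6 r^2
    print(f"{f} MHz: radius {r:.2f} km -> ~{math.ceil(sites)} sites")
```

With these illustrative numbers, the 800 MHz layer needs roughly one fifth to one sixth of the sites required at 2600 MHz, which is the mechanism behind the deployment-cost differences reported above.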