Abstract:
We propose a new method for fitting proportional hazards models with error-prone covariates. Regression coefficients are estimated by solving an estimating equation that is the average of the partial likelihood scores based on imputed true covariates. For the purpose of imputation, a linear spline model is assumed on the baseline hazard. We discuss consistency and asymptotic normality of the resulting estimators, and propose a stochastic approximation scheme to obtain the estimates. The algorithm is easy to implement, and reduces to the ordinary Cox partial likelihood approach when the measurement error has a degenerate distribution. Simulations indicate high efficiency and robustness. We consider the special case where error-prone replicates are available on the unobserved true covariates. As expected, increasing the number of replicates for the unobserved covariates increases efficiency and reduces bias. We illustrate the practical utility of the proposed method with an Eastern Cooperative Oncology Group clinical trial in which a genetic marker, c-myc expression level, is subject to measurement error.
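As a schematic illustration of the estimating equation described above (the notation and the number of imputations M are our own and are not taken from the paper), the coefficient vector is the root of the averaged partial likelihood score evaluated at imputed covariates:

```latex
% Schematic only: U_PL denotes the Cox partial likelihood score,
% X_i^{(m)} the m-th imputed value of the true covariate for subject i,
% and M the number of imputations drawn under the assumed linear-spline
% baseline hazard. None of this notation is taken from the paper.
\[
  \frac{1}{M}\sum_{m=1}^{M} U_{\mathrm{PL}}\!\left(\beta;\, X_1^{(m)},\dots,X_n^{(m)}\right) = 0
\]
% If the measurement error is degenerate, each X_i^{(m)} equals the
% observed covariate and the equation reduces to the ordinary Cox
% partial likelihood score equation.
```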
Abstract:
Generalized linear mixed models (GLMMs) provide an elegant framework for the analysis of correlated data. Because the likelihood has no closed form, GLMMs are often fit by computational procedures such as penalized quasi-likelihood (PQL). Special cases of these models are generalized linear models (GLMs), which are often fit using algorithms like iterative weighted least squares (IWLS). High computational costs and memory constraints often make it difficult to apply these iterative procedures to data sets with a very large number of cases. This paper proposes a computationally efficient strategy based on the Gauss-Seidel algorithm that iteratively fits sub-models of the GLMM to subsetted versions of the data. Additional gains in efficiency are achieved for Poisson models, commonly used in disease mapping problems, because of their special collapsibility property, which allows data reduction through summaries. Convergence of the proposed iterative procedure is guaranteed for canonical link functions. The strategy is applied to investigate the relationship between ischemic heart disease, socioeconomic status, and age/gender category in New South Wales, Australia, based on outcome data consisting of approximately 33 million records. A simulation study demonstrates the algorithm's reliability in analyzing a data set with 12 million records for a (non-collapsible) logistic regression model.
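The IWLS step mentioned above is easy to sketch for a Poisson GLM with the canonical log link. The code below is a generic textbook illustration, not the paper's Gauss-Seidel sub-model algorithm, and the function and variable names are our own.

```python
import numpy as np

def iwls_poisson(X, y, n_iter=25, tol=1e-8):
    """Fit a Poisson GLM with log link by iterative weighted least squares.

    Generic textbook sketch (not the paper's Gauss-Seidel scheme): at each
    step the working response z = eta + (y - mu) / mu is regressed on X
    with weights W = mu, the canonical-link IWLS weights for Poisson data.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)
        W = mu                       # Var(y) = mu for Poisson, canonical link
        z = eta + (y - mu) / mu      # working response
        XtW = X.T * W
        beta_new = np.linalg.solve(XtW @ X, XtW @ z)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

# Tiny usage example with simulated counts.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(500), rng.normal(size=500)])
y = rng.poisson(np.exp(X @ np.array([0.5, 0.3])))
print(iwls_poisson(X, y))
```

For records sharing an identical covariate pattern, the Poisson log-likelihood depends on the data only through the summed counts, which is one way the collapsibility-based data reduction mentioned above can be exploited before running a fit like this.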
Abstract:
Question and answer session with presenters Daniel Schneider, Ryan Tate, and Brendan Pelto.
Abstract:
Research in autophagy continues to accelerate,(1) and as a result many new scientists are entering the field. Accordingly, it is important to establish a standard set of criteria for monitoring macroautophagy in different organisms. Recent reviews have described the range of assays that have been used for this purpose.(2,3) There are many useful and convenient methods that can be used to monitor macroautophagy in yeast, but relatively few in other model systems, and there is much confusion regarding acceptable methods to measure macroautophagy in higher eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers of autophagosomes versus those that measure flux through the autophagy pathway; thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from fully functional autophagy that includes delivery to, and degradation within, lysosomes (in most higher eukaryotes) or the vacuole (in plants and fungi). Here, we present a set of guidelines for the selection and interpretation of the methods that can be used by investigators who are attempting to examine macroautophagy and related processes, as well as by reviewers who need to provide realistic and reasonable critiques of papers that investigate these processes. This set of guidelines is not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to verify an autophagic response.
Abstract:
The Collingwood Member is a mid- to late-Ordovician self-sourced reservoir deposited across the northern Michigan Basin and parts of Ontario, Canada. Although it had been previously studied in Canada, relatively little data have been available from the Michigan subsurface. Recent commercial interest in the Collingwood has resulted in the drilling and production of several wells in the state of Michigan. An analysis of core samples, measured laboratory data, and petrophysical logs has yielded both a quantitative and qualitative understanding of the formation in the Michigan Basin. The Collingwood is a low-permeability, low-porosity carbonate package that is very high in organic content. It is composed primarily of a uniformly fine-grained carbonate matrix with lesser amounts of kerogen, silica, and clays. The kerogen content of the Collingwood is finely dispersed in the clay and carbonate mineral phases. Geochemical and production data show that both oil and gas phases are present based on regional thermal maturity. The deposit is richest in the north-central part of the basin, where deposition is thickest and organic content highest. The Collingwood is a fairly thin deposit, and vertical fractures may easily extend into the surrounding formations. Completion and treatment techniques should be designed around these parameters to enhance production.
Abstract:
Routine bridge inspections require labor-intensive and highly subjective visual interpretation to determine bridge deck surface condition. Light Detection and Ranging (LiDAR), a relatively new class of survey instrument, has become a popular and increasingly used technology for providing as-built and inventory data in civil applications. While an increasing number of private and governmental agencies possess terrestrial and mobile LiDAR systems, an understanding of the technology's capabilities and potential applications continues to evolve. LiDAR is a line-of-sight instrument; as such, care must be taken when establishing scan locations and resolution so that the captured data are detailed enough to define the features that contribute to the analysis of bridge deck surface condition. Information such as the location, area, and volume of spalling on deck surfaces, undersides, and support columns can be derived from properly collected LiDAR point clouds. The point clouds contain information that can provide quantitative surface condition measurements, resulting in more accurate structural health monitoring. LiDAR scans were collected at three study bridges, each of which displayed a varying degree of degradation. A variety of commercially available analysis tools and an independently developed algorithm written in ArcGIS Python (ArcPy) were used to locate and quantify surface defects, including the location, area, and volume of spalls. The results were displayed visually and numerically in a user-friendly web-based decision support tool that integrates prior bridge condition metrics for comparison. LiDAR data processing procedures, along with strengths and limitations of point clouds for defining features useful for assessing bridge deck condition, are discussed. Point cloud density and incidence angle are two attributes that must be managed carefully to ensure the data collected are of high quality and useful for bridge condition evaluation. When collected properly, LiDAR data can be analyzed to provide a useful data set from which to derive bridge deck condition information.
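To make the spall-quantification step concrete, the sketch below shows one simple way such an analysis could proceed. It is our own illustration under stated assumptions, not the ArcPy algorithm developed in the study: fit a reference plane to the deck points, treat points lying more than a threshold below that plane as spalled, and estimate spall area and volume on a regular grid.

```python
import numpy as np

def quantify_spalls(points, depth_threshold=0.01, cell_size=0.05):
    """Rough spall area/volume estimate from deck points (x, y, z in metres).

    Illustration only (not the study's ArcPy implementation): fit a plane
    z = a*x + b*y + c to the deck by least squares, flag points more than
    depth_threshold below the plane as spalled, then rasterise the flagged
    points onto cell_size x cell_size cells to estimate area and volume.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.column_stack([x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    depth = (A @ coeffs) - z            # positive where the surface dips below the plane
    spalled = depth > depth_threshold

    # Rasterise spalled points to grid cells, keeping the deepest point per cell.
    ij = np.floor(points[spalled, :2] / cell_size).astype(int)
    cell_depth = {}
    for (i, j), d in zip(map(tuple, ij), depth[spalled]):
        cell_depth[(i, j)] = max(cell_depth.get((i, j), 0.0), d)

    area = len(cell_depth) * cell_size**2          # m^2 of spalled cells
    volume = sum(cell_depth.values()) * cell_size**2  # m^3, depth x cell area
    return area, volume
```

In practice the depth threshold and cell size would be tuned to the point cloud density and incidence angle, which is consistent with the attention those two attributes receive above.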
Abstract:
The U.S. natural gas industry has changed because of the recent ability to produce natural gas from unconventional shale deposits. One of the largest and most important deposits is the Marcellus Shale. Hydraulic fracturing and horizontal drilling have made production technically feasible, but concerns exist regarding the economics of shale gas production. These concerns stem from limited production and economic data for shale gas wells, declining rates of production, falling natural gas prices, oversupply coupled with slow growth in U.S. natural gas demand, and rising production costs. Profitability was assessed through an economic analysis of an average shale gas well, using data representative of natural gas production in the Marcellus Shale from 2009 to 2011. Despite the adverse conditions facing the shale gas industry, the analysis concludes that a shale gas well in the Marcellus Shale is profitable based on net present value (NPV), internal rate of return (IRR), and breakeven price calculations.
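The profitability metrics named above are standard. The sketch below shows how NPV, IRR, and a breakeven gas price could be computed for a hypothetical well with a simple exponential decline; the decline parameters, costs, and price are illustrative placeholders, not the values used in the analysis.

```python
# Illustrative well economics; all inputs are hypothetical placeholders,
# not the Marcellus figures used in the analysis.
capital_cost = 4.0e6        # $ drilling and completion (year 0)
initial_rate = 1.2e6        # Mcf produced in year 1
annual_decline = 0.35       # fractional drop in production each year
operating_cost = 1.00       # $/Mcf
discount_rate = 0.10
years = 15

def cash_flows(gas_price):
    """Yearly net cash flows; year 0 is the capital outlay."""
    flows = [-capital_cost]
    rate = initial_rate
    for _ in range(years):
        flows.append(rate * (gas_price - operating_cost))
        rate *= (1.0 - annual_decline)
    return flows

def npv(rate, flows):
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(flows))

def irr(flows, lo=-0.99, hi=10.0, tol=1e-6):
    """Internal rate of return by bisection on the NPV sign change."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if npv(mid, flows) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def breakeven_price(lo=0.0, hi=50.0, tol=1e-4):
    """Gas price ($/Mcf) at which NPV at the discount rate is zero."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if npv(discount_rate, cash_flows(mid)) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

flows = cash_flows(gas_price=4.00)
print(f"NPV @10%: ${npv(discount_rate, flows):,.0f}")
print(f"IRR: {irr(flows):.1%}")
print(f"Breakeven price: ${breakeven_price():.2f}/Mcf")
```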
Abstract:
Rainwater harvesting (RWH) has a long history and has been supported as an appropriate technology and relatively cheap source of domestic water supply. This study compares the suitability of RWH and piped water systems in three rural Dominican communities seeking to improve their water systems. Ethnographic methods capturing the views of residents, together with feasibility and cost analyses of the options, led to the conclusion that RWH is not a feasible or cost-effective solution for the domestic water needs of all households in the communities studied. RWH investment is best left to individual households, which can implement informal RWH with incremental increases in storage volume. Piped water distribution (PWD) systems, though perceived as too large or expensive to implement, have much lower capital costs and are better supported by residents as a solution because they provide the large quantities of water needed to maintain water services beyond mere survival levels.
Abstract:
In the Dominican Republic, economic growth over the past twenty years has not yielded sufficient improvement in access to drinking water services, especially in rural areas, where 1.5 million people do not have access to an improved water source (WHO, 2006). Worldwide, strategic development planning in the rural water sector has focused on participatory processes and the use of demand filters to ensure that service levels match community commitment to post-project operation and maintenance. However, studies have concluded that an alarmingly high percentage of drinking water systems (20-50%) do not provide service at the design levels and/or fail altogether (up to 90%) (BNWP, 2009; Annis, 2006; Reents, 2003). The World Bank, USAID, NGOs, and private consultants have invested significant resources in an effort to determine what components make up an "enabling environment" for sustainable community management of rural water systems (RWS). Research has identified an array of critical factors, internal and external to the community, that affect the long-term sustainability of water services, and different frameworks have been proposed to better understand the linkages between individual factors and sustainability of service. This research proposes a Sustainability Analysis Tool to evaluate the sustainability of RWS, adapted from previous relevant work in the field to reflect the realities in the Dominican Republic. It can be used as a diagnostic tool by government entities and development organizations to characterize the needs of specific communities and identify weaknesses in existing training regimes or support mechanisms. The framework uses eight indicators in three categories (Organization/Management, Financial Administration, and Technical Service). Nineteen independent variables are measured, resulting in a rating of sustainability likely (SL), sustainability possible (SP), or sustainability unlikely (SU) for each of the eight indicators. Thresholds are based upon benchmarks from the DR and around the world, primary data collected during the research, and the author's 32 months of field experience. A final sustainability score is calculated using weighting factors for each indicator, derived from Lockwood (2003). The framework was tested using a statistically representative, geographically stratified random sample of 61 water systems built in the DR through initiatives of the National Institute of Potable Water (INAPA) and the Peace Corps. The results indicate that 23% of the sampled systems are likely to be sustainable in the long term, 59% are possibly sustainable, and for 18% it is unlikely that the community will be able to overcome any significant challenge. Communities scored as unlikely to be sustainable performed poorly in participation, financial durability, and governance, while the highest scores were for system function and repair service. The Sustainability Analysis Tool results are corroborated by INAPA and Peace Corps reports, evaluations, and database information, as well as field observations and primary data collected during the surveys. Future research will analyze the nature and magnitude of relationships between key factors and the sustainability score defined by the tool; factors include gender participation, legal status of water committees, plumber/operator remuneration, demand responsiveness, post-construction support methodologies, and project design criteria.
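To illustrate the scoring mechanics described above, the sketch below codes each indicator's SL/SP/SU rating numerically and combines the eight indicators into a weighted final score. The indicator names, weights, numeric coding, and cut-offs are placeholders for illustration only; the actual weights (derived from Lockwood, 2003) and thresholds are defined in the thesis.

```python
# Illustration of the weighted scoring mechanics only; the indicator names,
# weights, numeric coding, and final cut-offs below are placeholders, not
# the values used in the Sustainability Analysis Tool.
INDICATOR_WEIGHTS = {
    "participation": 0.15,
    "governance": 0.15,
    "financial_durability": 0.15,
    "tariff_collection": 0.10,
    "system_function": 0.15,
    "repair_service": 0.10,
    "water_quality": 0.10,
    "external_support": 0.10,
}
# Each indicator is first rated SL / SP / SU from its underlying variables.
RATING_VALUE = {"SL": 2, "SP": 1, "SU": 0}

def final_score(indicator_ratings):
    """Weighted sum of the coded SL/SP/SU ratings for the eight indicators."""
    assert set(indicator_ratings) == set(INDICATOR_WEIGHTS)
    return sum(INDICATOR_WEIGHTS[k] * RATING_VALUE[indicator_ratings[k]]
               for k in INDICATOR_WEIGHTS)

def overall_category(score, sl_cutoff=1.5, sp_cutoff=0.75):
    """Map the weighted score (0-2 scale) back to an overall SL/SP/SU call."""
    if score >= sl_cutoff:
        return "SL"
    if score >= sp_cutoff:
        return "SP"
    return "SU"

example = {
    "participation": "SU", "governance": "SP", "financial_durability": "SU",
    "tariff_collection": "SP", "system_function": "SL", "repair_service": "SL",
    "water_quality": "SP", "external_support": "SP",
}
score = final_score(example)
print(score, overall_category(score))
```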
Abstract:
Fish behaviourists are increasingly turning to non-invasive measurement of steroid hormones in holding water, as opposed to blood plasma. When some of us met at a workshop in Faro, Portugal, in September, 2007, we realised that there were still many issues concerning the application of this procedure that needed resolution, including: Why do we measure release rates rather than just concentrations of steroids in the water? How does one interpret steroid release rates when dealing with fish of different sizes? What are the merits of measuring conjugated as well as free steroids in water? In the ‘static’ sampling procedure, where fish are placed in a separate container for a short period of time, does this affect steroid release—and, if so, how can it be minimised? After exposing a fish to a behavioural stimulus, when is the optimal time to sample? What is the minimum amount of validation when applying the procedure to a new species? The purpose of this review is to attempt to answer these questions and, in doing so, to emphasize that application of the non-invasive procedure requires more planning and validation than conventional plasma sampling. However, we consider that the rewards justify the extra effort.