439 results for R-Statistical computing


Relevance: 20.00%

Abstract:

The high priority of monitoring workers exposed to nitrobenzene is a consequence of clear findings of experimental carcinogenicity of nitrobenzene and the associated evaluations by the International Agency for Research on Cancer. Eighty male employees of a nitrobenzene reduction plant, with potential skin contact with nitrobenzene and aniline, participated in a current medical surveillance programme. Blood samples were routinely taken and analysed for aniline, 4-aminodiphenyl (4-ADP) and benzidine adducts of haemoglobin (Hb) and human serum albumin (HSA). Levels of methaemoglobin (Met-Hb) and of carbon monoxide haemoglobin (CO-Hb) were also monitored. Effects of smoking were clear-cut: using the Wilcoxon rank sum test, we found statistically significant smoking effects (about 3-fold increases) on CO-Hb (P = 0.00085) and on the Hb adduct of 4-ADP (P = 0.0006). The mean aniline-Hb adduct level in smokers was 1.5 times higher than in non-smokers; the significance (P = 0.05375) was close to the 5% level. The strongest correlation was evident between the Hb and HSA adducts of aniline (rs = 0.846). Less pronounced correlations (but with P values < 0.02) appeared between aniline-Hb and 4-ADP-Hb adducts (rs = 0.388), between 4-ADP-Hb and 4-ADP-HSA adducts (rs = 0.373), and between 4-ADP-Hb and aniline-HSA adducts (rs = 0.275). In view of the proposal for additional use of the aniline-HSA adduct for biological monitoring, particularly in cases of acute overexposures or poisonings, the strong correlation of the Hb and HSA conjugates is noteworthy; the ratio aniline-HSA:aniline-Hb was 1:42 for the entire cohort.
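The statistical machinery referenced in this abstract (the Wilcoxon rank sum test for smoker versus non-smoker comparisons, and Spearman rank correlations between adduct levels) is standard and easy to reproduce in outline. The sketch below is illustrative only: it runs SciPy on entirely hypothetical adduct values, not the study's data.

```python
# Illustrative sketch only: the adduct values below are hypothetical,
# not the study's data. scipy.stats.ranksums implements the Wilcoxon
# rank sum test used to compare smokers and non-smokers.
import numpy as np
from scipy.stats import ranksums, spearmanr

rng = np.random.default_rng(0)

# Hypothetical CO-Hb levels (%) for non-smokers and smokers
co_hb_nonsmokers = rng.normal(1.0, 0.3, size=40)
co_hb_smokers = rng.normal(3.0, 0.8, size=40)   # roughly 3-fold higher on average

stat, p_value = ranksums(co_hb_smokers, co_hb_nonsmokers)
print(f"Wilcoxon rank sum test: statistic={stat:.2f}, P={p_value:.4g}")

# Hypothetical paired Hb and HSA adduct levels of aniline (1:42 ratio assumed)
aniline_hb = rng.lognormal(mean=3.0, sigma=0.5, size=80)
aniline_hsa = aniline_hb / 42 * rng.lognormal(mean=0.0, sigma=0.2, size=80)

rho, p_corr = spearmanr(aniline_hb, aniline_hsa)
print(f"Spearman correlation (Hb vs HSA adducts): rs={rho:.3f}, P={p_corr:.4g}")
```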

Relevance: 20.00%

Abstract:

A computationally efficient sequential Monte Carlo algorithm is proposed for the sequential design of experiments for the collection of block data described by mixed effects models. The difficulty in applying a sequential Monte Carlo algorithm in such settings is the need to evaluate the observed data likelihood, which is typically intractable for all but linear Gaussian models. To overcome this difficulty, we propose to estimate the likelihood unbiasedly and to perform inference and make decisions based on an exact-approximate algorithm. Two estimators are proposed: one based on quasi-Monte Carlo methods and one based on the Laplace approximation with importance sampling. Both of these approaches can be computationally expensive, so we propose exploiting parallel computational architectures to ensure designs can be derived in a timely manner. We also extend our approach to allow for model uncertainty. This research is motivated by important pharmacological studies related to the treatment of critically ill patients.
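The exact-approximate idea is to drive the algorithm with an unbiased Monte Carlo estimate of the intractable observed-data likelihood. The sketch below is not the authors' algorithm; it only illustrates, for a hypothetical random-intercept logistic model, how such an unbiased estimate can be formed by averaging over draws of the random effects, which is the kind of quantity that would feed the particle weights in a sequential Monte Carlo design procedure.

```python
# Illustrative sketch, not the paper's algorithm: unbiased Monte Carlo
# estimate of the observed-data likelihood for a hypothetical
# random-intercept logistic model, of the kind used inside an
# exact-approximate (pseudo-marginal) SMC algorithm.
import numpy as np

def estimate_block_likelihood(y, x, beta, sigma_b, n_draws=1000, rng=None):
    """Estimate p(y | x, beta, sigma_b) for one subject's block of binary
    responses by averaging over draws of the random intercept."""
    if rng is None:
        rng = np.random.default_rng()
    b = rng.normal(0.0, sigma_b, size=n_draws)            # random intercepts
    eta = beta[0] + beta[1] * x[None, :] + b[:, None]      # n_draws x n_obs
    p = 1.0 / (1.0 + np.exp(-eta))
    lik_per_draw = np.prod(np.where(y[None, :] == 1, p, 1.0 - p), axis=1)
    return lik_per_draw.mean()                             # unbiased estimate

rng = np.random.default_rng(1)
x = np.linspace(0.0, 4.0, 6)                # hypothetical sampling times
y = (rng.random(6) < 0.5).astype(int)       # hypothetical binary responses
print(estimate_block_likelihood(y, x, beta=np.array([-0.5, 0.4]),
                                sigma_b=1.0, rng=rng))
```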

Relevance: 20.00%

Abstract:

Professor Peter Barrett, at the 2013 CIB World Building Congress (WBC13), presented a timely context for the future of research and development (R&D) investment in the global construction industry (Barrett, 2013). He called for a shift in focus from lessons learned and doing things better towards asking what the right thing to do is, and towards developing a new paradigm for achieving this. This shift requires empathy with industry and users; a desire to generate and transmit knowledge; an opportunity to study deeply and over the long term; and an objective stance towards positive and negative findings. This shift includes the creation of standards for the holistic impact of spaces through exemplary pilot projects creating evidence for policy makers and clients (Barrett, 2013)...

Relevance: 20.00%

Abstract:

Posttraumatic stress disorder (PTSD) is a complex syndrome that occurs following exposure to a potentially life-threatening traumatic event. This review summarises the literature on the genetics of PTSD, including gene–environment interactions (GxE), epigenetics and the genetics of treatment response. Numerous genes have been shown to be associated with PTSD using candidate gene approaches. Genome-wide association studies have been limited by the large sample sizes required to achieve adequate statistical power. Studies have shown that GxE interactions are important for PTSD susceptibility. Epigenetics plays an important role in PTSD susceptibility, and some of the most promising studies show that stress and child abuse trigger epigenetic changes. Much of the molecular genetics of PTSD remains to be elucidated. However, it is clear that identifying genetic markers and environmental triggers has the potential to advance early PTSD diagnosis and therapeutic interventions and, ultimately, to ease the personal and financial burden of this debilitating disorder.

Relevance: 20.00%

Abstract:

Few would disagree that the upstream oil & gas industry has become more technology-intensive over the years. But how does innovation happen in the industry? Specifically, what ideas and inputs flow from which parts of the sector's value network, and where do these inputs go? And how do firms and organizations from different countries contribute differently to this process? This paper puts forward the results of a survey designed to shed light on these questions. Carried out in collaboration with the Society of Petroleum Engineers (SPE), the survey was sent to 469 executives and senior managers who played a significant role with regard to R&D and/or technology deployment in their respective business units. A total of 199 responses were received from a broad range of organizations and countries around the world. Several interesting themes and trends emerge from the results, including: (1) service companies tend to file considerably more patents per innovation than other types of organization; (2) over 63% of the deployed innovations reported in the survey originated in service companies; (3) neither universities nor government-led research organizations were considered to be valuable sources of new information and knowledge in the industry's R&D initiatives; and (4) despite the increasing degree of globalization in the marketplace, the USA still plays an extremely dominant role in the industry's overall R&D and technology deployment activities. By providing a detailed and objective snapshot of how innovation happens in the upstream oil & gas sector, this paper provides a valuable foundation for future investigations and discussions aimed at improving how R&D and technology deployment are managed within the industry. The methodology did result in a coverage bias within the survey, however, and the limitations arising from this are explored.

Relevance: 20.00%

Abstract:

This report presents the final deliverable from the project titled 'Conceptual and statistical framework for a water quality component of an integrated report card', funded by the Marine and Tropical Sciences Research Facility (MTSRF; Project 3.7.7). The key management driver of this, and a number of other MTSRF projects concerned with indicator development, is the requirement for state and federal government authorities and other stakeholders to provide robust assessments of the present 'state' or 'health' of regional ecosystems in the Great Barrier Reef (GBR) catchments and adjacent marine waters. An integrated report card format that encompasses both biophysical and socioeconomic factors is an appropriate framework through which to deliver these assessments and meet a variety of reporting requirements. It is now well recognised that a 'report card' format for environmental reporting is very effective for community and stakeholder communication and engagement, and can be a key driver in galvanising community and political commitment and action. Although a report card needs to be understandable by all levels of the community, it also needs to be underpinned by sound, quality-assured science. In this regard, this project was to develop approaches to address the statistical issues that arise from amalgamation or integration of sets of discrete indicators into a final score or assessment of the state of the system. In brief, the two main issues are (1) selecting, measuring and interpreting specific indicators that vary both in space and time, and (2) integrating a range of indicators in such a way as to provide a succinct but robust overview of the state of the system. Although there is considerable research on, and knowledge of, the use of indicators to inform the management of ecological, social and economic systems, methods for how best to integrate multiple disparate indicators remain poorly developed. Therefore, the objective of this project was to (i) focus on statistical approaches aimed at ensuring that estimates of individual indicators are as robust as possible, and (ii) present methods that can be used to report on the overall state of the system by integrating estimates of individual indicators. It was agreed at the outset that this project would focus on developing methods for a water quality report card. This was driven largely by the requirements of the Reef Water Quality Protection Plan (RWQPP) and led to strong partner engagement with the Reef Water Quality Partnership.
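A central statistical issue described here is the amalgamation of disparate indicators into a single score. The sketch below illustrates only the generic normalise-weight-aggregate pattern; the indicator names, weights and grade bands are hypothetical and are not the framework developed in the project.

```python
# Hypothetical illustration of integrating several water quality
# indicators into a single report card grade; the indicators, weights
# and grade bands below are invented, not the project's scheme.
def report_card_score(indicators, weights):
    """Weighted average of indicator scores already scaled to [0, 1]."""
    total_weight = sum(weights[name] for name in indicators)
    return sum(indicators[name] * weights[name] for name in indicators) / total_weight

def to_grade(score):
    """Map an aggregate score in [0, 1] onto a letter grade."""
    bands = [(0.85, "A"), (0.65, "B"), (0.50, "C"), (0.25, "D")]
    for cutoff, grade in bands:
        if score >= cutoff:
            return grade
    return "E"

indicators = {"chlorophyll_a": 0.72, "turbidity": 0.55, "dissolved_nitrogen": 0.40}
weights = {"chlorophyll_a": 1.0, "turbidity": 1.0, "dissolved_nitrogen": 2.0}

score = report_card_score(indicators, weights)
print(f"aggregate score = {score:.2f}, grade = {to_grade(score)}")
```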

Relevance: 20.00%

Abstract:

PURPOSE The purpose of this study was to demonstrate the potential of near infrared (NIR) spectroscopy for characterizing the health and degenerative state of articular cartilage based on the components of the Mankin score. METHODS Three models of osteoarthritic degeneration induced in laboratory rats by anterior cruciate ligament (ACL) transection, meniscectomy (MSX), and intra-articular injection of monoiodoacetate (1 mg) (MIA) were used in this study. Degeneration was induced in the right knee joint; each model group consisted of 12 rats (N = 36). After 8 weeks, the animals were euthanized and knee joints were collected. A custom-made diffuse reflectance NIR probe of 5-mm diameter was placed on the tibial and femoral surfaces, and spectral data were acquired from each specimen in the wavenumber range of 4,000 to 12,500 cm⁻¹. After spectral data acquisition, the specimens were fixed and safranin O staining (SOS) was performed to assess disease severity based on the Mankin scoring system. Using multivariate statistical analysis, with spectral preprocessing and a wavelength selection technique, the spectral data were then correlated to the structural integrity (SI), cellularity (CEL), and matrix staining (SOS) components of the Mankin score for all the samples tested. RESULTS ACL models showed mild cartilage degeneration, MSX models had moderate degeneration, and MIA models showed severe cartilage degenerative changes both morphologically and histologically. Our results reveal significant linear correlations between the NIR absorption spectra and the SI (R² = 94.78%), CEL (R² = 88.03%), and SOS (R² = 96.39%) parameters of all samples in the models. In addition, clustering of the samples according to their level of degeneration, with respect to the Mankin components, was also observed. CONCLUSIONS NIR spectroscopic probing of articular cartilage can potentially provide critical information about the health of the articular cartilage matrix in early and advanced stages of osteoarthritis (OA). CLINICAL RELEVANCE This rapid, nondestructive method can facilitate clinical appraisal of articular cartilage integrity during arthroscopic surgery.
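The multivariate step that relates preprocessed spectra to the Mankin components is commonly performed with a latent-variable regression such as partial least squares. The sketch below is a generic illustration on synthetic spectra, not the study's data or exact pipeline, using scikit-learn's PLSRegression and a cross-validated R² against a hypothetical structural integrity score.

```python
# Generic illustration with synthetic data, not the study's pipeline:
# partial least squares regression of NIR absorption spectra against a
# Mankin-style structural integrity score.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score

rng = np.random.default_rng(2)

n_samples, n_wavenumbers = 36, 200           # e.g. 36 joints, 200 spectral points
latent = rng.normal(size=(n_samples, 3))     # hidden compositional factors
loadings = rng.normal(size=(3, n_wavenumbers))
spectra = latent @ loadings + 0.05 * rng.normal(size=(n_samples, n_wavenumbers))
structural_integrity = latent[:, 0] * 2.0 + rng.normal(scale=0.3, size=n_samples)

pls = PLSRegression(n_components=3)
predicted = cross_val_predict(pls, spectra, structural_integrity, cv=6)
print(f"cross-validated R^2 = {r2_score(structural_integrity, np.ravel(predicted)):.3f}")
```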

Relevance: 20.00%

Abstract:

Distributed computation and storage have been widely used for processing big data sets. For many big data problems, with the size of data growing rapidly, the distribution of computing tasks and related data can greatly affect the performance of the computing system. In this paper, a distributed computing framework is presented for high performance computing of All-to-All Comparison Problems. A data distribution strategy is embedded in the framework for reduced storage space and balanced computing load. Experiments are conducted to demonstrate the effectiveness of the developed approach. They show that about 88% of the ideal performance capacity is achieved across multiple machines using the approach presented in this paper.
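To make the all-to-all comparison workload concrete, the sketch below enumerates the n(n-1)/2 pairwise comparison tasks and spreads them over a pool of local worker processes. It is a single-machine illustration of the task decomposition only, not the paper's storage-aware data distribution strategy.

```python
# Single-machine illustration of an all-to-all comparison workload:
# every unordered pair of items is compared once and the tasks are
# spread over worker processes. This sketches the task decomposition
# only, not the paper's storage-aware data distribution strategy.
from itertools import combinations
from multiprocessing import Pool

def compare(pair):
    """Placeholder comparison function, e.g. a similarity score."""
    a, b = pair
    return (a, b, abs(len(a) - len(b)))   # toy metric: length difference

if __name__ == "__main__":
    items = ["ACGT", "ACGTT", "AAGT", "CCGTA", "ACG"]   # hypothetical data items
    pairs = list(combinations(items, 2))                # n(n-1)/2 comparison tasks

    with Pool(processes=4) as pool:
        results = pool.map(compare, pairs)              # tasks balanced across workers

    for a, b, score in results:
        print(f"compare({a}, {b}) = {score}")
```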

Relevance: 20.00%

Abstract:

Impaired driver alertness increases the likelihood of drivers making mistakes and reacting too late to unexpected events while driving. This is particularly a concern on monotonous roads, where a driver's attention can decrease rapidly. While effective countermeasures do not currently exist, the development of in-vehicle sensors opens avenues for monitoring driving behavior in real time. The aim of this study is to predict drivers' level of alertness through surrogate measures collected from in-vehicle sensors. Electroencephalographic activity is used as a reference to evaluate alertness. Based on a sample of 25 drivers, data were collected in a driving simulator instrumented with an eye tracking system, a heart rate monitor and an electrodermal activity device. Various classification models were tested, from linear regression to Bayesian and data mining techniques. Results indicated that neural networks were the most efficient model for detecting lapses in alertness. Findings also show that reduced alertness can be predicted up to 5 minutes in advance with 90% accuracy, using surrogate measures such as time to line crossing, blink frequency and skin conductance level. Such a method could be used to warn drivers of reduced alertness through an in-vehicle device that monitors driving behavior on highways in real time.
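As a concrete, if simplified, illustration of the modelling step, the sketch below trains a small neural network classifier on simulated surrogate measures (time to line crossing, blink frequency, skin conductance) to predict a binary alertness label. The feature values, the labelling rule and the network architecture are all hypothetical; the study's own data and models are not reproduced here.

```python
# Hypothetical illustration, not the study's model: a small neural
# network classifier predicting lapses in alertness from simulated
# surrogate measures collected by in-vehicle sensors.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
n = 500

time_to_line_crossing = rng.normal(2.5, 0.8, n)   # seconds (simulated)
blink_frequency = rng.normal(18.0, 5.0, n)        # blinks/min (simulated)
skin_conductance = rng.normal(6.0, 2.0, n)        # microsiemens (simulated)

X = np.column_stack([time_to_line_crossing, blink_frequency, skin_conductance])
# Simulated rule: short time to line crossing combined with low skin
# conductance marks reduced alertness in this toy data set.
y = ((time_to_line_crossing < 2.2) & (skin_conductance < 6.5)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                    random_state=0))
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```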

Relevance: 20.00%

Abstract:

Fine-grained leaf classification has concentrated on the use of traditional shape and statistical features to classify ideal images. In this paper we evaluate the effectiveness of traditional hand-crafted features and propose the use of deep convolutional neural network (ConvNet) features. We introduce a range of condition variations to explore the robustness of these features, including: translation, scaling, rotation, shading and occlusion. Evaluations on the Flavia dataset demonstrate that in ideal imaging conditions, combining traditional and ConvNet features yields state-of-the-art performance with an average accuracy of 97.3% ± 0.6%, compared to traditional features, which obtain an average accuracy of 91.2% ± 1.6%. Further experiments show that this combined classification approach consistently outperforms the best set of traditional features by an average of 5.7% for all of the evaluated condition variations.
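The combined classifier follows a simple pattern: concatenate the hand-crafted shape and statistical descriptors with features extracted from a pre-trained ConvNet, then train a single conventional classifier on the joint representation. The sketch below illustrates only that concatenation-and-classify pattern on randomly generated feature vectors; the real pipeline would substitute actual leaf descriptors and ConvNet activations computed from the Flavia images.

```python
# Pattern sketch with simulated features, not the paper's pipeline:
# concatenating hand-crafted descriptors with ConvNet features and
# training a single classifier on the combined representation.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n_classes, per_class = 32, 10
labels = np.repeat(np.arange(n_classes), per_class)      # 320 simulated leaves

class_centres_hand = rng.normal(size=(n_classes, 20))    # stand-in shape/stat features
class_centres_conv = rng.normal(size=(n_classes, 256))   # stand-in ConvNet features

handcrafted = class_centres_hand[labels] + 0.8 * rng.normal(size=(labels.size, 20))
convnet = class_centres_conv[labels] + 0.8 * rng.normal(size=(labels.size, 256))
combined = np.hstack([handcrafted, convnet])             # joint representation

for name, features in [("hand-crafted", handcrafted),
                       ("ConvNet", convnet),
                       ("combined", combined)]:
    scores = cross_val_score(LinearSVC(max_iter=5000), features, labels, cv=5)
    print(f"{name:>12}: mean accuracy = {scores.mean():.3f}")
```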