159 results for multivariate classification
Abstract:
We examined variability in hierarchical beta diversity across ecosystems, geographical gradients, and organism groups using multivariate spatial mixed modeling analysis of two independent data sets. The larger data set comprised reported ratios of regional species richness (RSR) to local species richness (LSR), and the second data set consisted of RSR:LSR ratios derived from nested species-area relationships. There was a negative, albeit relatively weak, relationship between beta diversity and latitude. We found only relatively subtle differences in beta diversity among the realms, yet beta diversity was lower in marine systems than in terrestrial or freshwater realms. Beta diversity varied significantly with major organism characteristics such as body mass, trophic position, and dispersal type in the larger data set. Organisms that disperse via seeds had the highest beta diversity, and passively dispersed organisms showed the lowest beta diversity. Furthermore, autotrophs had lower beta diversity than organisms higher up the food web; omnivores and carnivores had consistently higher beta diversity. This is evidence that beta diversity is controlled simultaneously by extrinsic factors related to geography and environment and by intrinsic factors related to organism characteristics.
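The RSR:LSR ratio analysed here corresponds to multiplicative (Whittaker-type) beta diversity, beta = gamma/alpha. As a minimal illustrative sketch (the site-by-species data below are made up, not taken from the study), such ratios could be computed as:

```python
import numpy as np

# Illustrative sketch: multiplicative beta diversity as the ratio of
# regional species richness (RSR, gamma) to mean local species richness (LSR, alpha).
# The presence/absence matrix below is invented for demonstration only.
sites_by_species = np.array([
    [1, 1, 0, 0, 1],   # presence/absence of 5 species at site 1
    [0, 1, 1, 0, 1],   # site 2
    [0, 0, 1, 1, 0],   # site 3
])

regional_richness = np.count_nonzero(sites_by_species.any(axis=0))  # gamma: species found anywhere
local_richness = sites_by_species.sum(axis=1).mean()                # alpha: mean per-site richness

beta = regional_richness / local_richness  # RSR:LSR ratio
print(f"RSR = {regional_richness}, mean LSR = {local_richness:.2f}, beta = {beta:.2f}")
```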
Abstract:
The statistical properties of the multivariate Gamma-Gamma (ΓΓ) distribution with arbitrary correlation have remained unknown. In this paper, we provide analytical expressions for the joint probability density function (PDF), cumulative distribution function (CDF), and moment generating function of the multivariate ΓΓ distribution with arbitrary correlation. Furthermore, we present novel approximating expressions for the PDF and CDF of the sum of ΓΓ random variables with arbitrary correlation. Based on this statistical analysis, we investigate the performance of radio frequency and optical wireless communication systems. It is noteworthy that the presented expressions include several previous results in the literature as special cases.
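For context, a unit-mean ΓΓ variate can be generated as the product of two independent unit-mean Gamma variates. The sketch below is a Monte Carlo check for the independent case only (not the arbitrarily correlated case analysed in the paper); the parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def gamma_gamma(alpha, beta, size, rng):
    """Sample unit-mean Gamma-Gamma variates as the product of two
    independent Gamma variates with shapes alpha and beta (each with unit mean)."""
    x = rng.gamma(alpha, 1.0 / alpha, size)
    y = rng.gamma(beta, 1.0 / beta, size)
    return x * y

# Empirical check on the sum of independent GG variables.
alpha, beta, n_terms, n_samples = 4.0, 2.0, 3, 200_000
samples = sum(gamma_gamma(alpha, beta, n_samples, rng) for _ in range(n_terms))
print("sample mean of the sum:", samples.mean())      # ~ n_terms, since each variate has unit mean
print("sample variance of the sum:", samples.var())
```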
Abstract:
The applicability of ultra-short-term wind power prediction (USTWPP) models is reviewed. The proposed USTWPP method extracts features from historical wind power time series (WPTS) data and classifies each short WPTS into one of several subsets defined by stationary patterns. Any WPTS that does not match one of the stationary patterns is assigned to a nonstationary-pattern subset. Each subset is given its own USTWPP model, optimized offline. For online application, the pattern of the most recent short WPTS is recognized, and the corresponding prediction model is invoked for USTWPP. The validity of the proposed method is verified by simulations.
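The abstract does not specify the feature set or matching rule; a minimal sketch of the described pipeline, with hypothetical features and nearest-centroid pattern matching, might look like:

```python
import numpy as np

def extract_features(wpts):
    """Hypothetical features for a short wind power time series (WPTS) window."""
    wpts = np.asarray(wpts, dtype=float)
    trend = np.polyfit(np.arange(len(wpts)), wpts, deg=1)[0]
    return np.array([wpts.mean(), wpts.std(), trend])

def assign_pattern(features, pattern_centroids, threshold):
    """Assign a WPTS to the nearest stationary-pattern centroid, or to the
    nonstationary subset if no centroid is close enough."""
    distances = [np.linalg.norm(features - c) for c in pattern_centroids]
    best = int(np.argmin(distances))
    return best if distances[best] <= threshold else "nonstationary"

def predict_next(wpts, pattern_centroids, threshold, models):
    """Online use: recognize the pattern of the latest window, then call the
    prediction model optimized offline for that pattern subset."""
    pattern = assign_pattern(extract_features(wpts), pattern_centroids, threshold)
    return models[pattern].predict(wpts)

class PersistenceModel:
    """Trivial stand-in model: predicts the last observed value."""
    def predict(self, wpts):
        return wpts[-1]

centroids = [np.array([0.3, 0.05, 0.0]), np.array([0.7, 0.10, 0.02])]
models = {0: PersistenceModel(), 1: PersistenceModel(), "nonstationary": PersistenceModel()}
print(predict_next([0.31, 0.30, 0.29, 0.28], centroids, threshold=0.5, models=models))
```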
Abstract:
Breast cancer remains a frequent cause of female cancer death despite the great strides made in elucidating biological subtypes and their reported clinical and prognostic significance. We have characterised a general cohort of breast cancers in terms of putative actionable targets, involving growth and proliferative factors, the cell cycle, and apoptotic pathways, both as single biomarkers across the whole cohort and within intrinsic molecular subtypes.
We identified 293 patients treated with adjuvant chemotherapy. Additional hormonal therapy and trastuzumab were administered depending on hormonal and HER2 status, respectively. We performed immunohistochemistry for ER, PR, HER2, MM1, CK5/6, p53, TOP2A, EGFR, IGF1R, PTEN, p-mTOR and E-cadherin. The cohort was classified into luminal (62%) and non-luminal (38%) tumors, as well as luminal A (27%), luminal B HER2-negative (22%) and HER2-positive (12%), HER2-enriched (14%) and triple negative (25%). Patients with luminal tumors showing TOP2A co-overexpression or IGF1R loss displayed worse overall survival (p=0.0251 and p=0.0008, respectively). Non-luminal tumors had markedly more heterogeneous expression profiles, with no individual markers of prognostic significance. Non-luminal tumors were characterised by EGFR and TOP2A overexpression; IGF1R, PTEN and p-mTOR negativity; and extreme p53 expression.
Our results indicate that only a minority of intrinsic-subtype tumors purely express single novel actionable targets. This lack of pure biomarker expression is particularly prevalent in the triple negative subgroup and may point to the mechanism behind targeted therapy inaction and the myriad disappointing trial results. Utilising a combinatorial biomarker approach may enhance studies of targeted therapies, providing additional information during design and patient selection while also helping to decipher negative trial results.
Abstract:
Mobile malware has been growing in scale and complexity as smartphone usage continues to rise. Android has surpassed other mobile platforms as the most popular whilst also witnessing a dramatic increase in malware targeting the platform. A worrying trend that is emerging is the increasing sophistication of Android malware to evade detection by traditional signature-based scanners. As such, Android app marketplaces remain at risk of hosting malicious apps that could evade detection before being downloaded by unsuspecting users. Hence, in this paper we present an effective approach to alleviate this problem based on Bayesian classification models obtained from static code analysis. The models are built from a collection of code and app characteristics that provide indicators of potential malicious activities. The models are evaluated with real malware samples in the wild and results of experiments are presented to demonstrate the effectiveness of the proposed approach.
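As a simplified illustration of Bayesian classification over static-analysis features (the paper's actual feature set and model are not reproduced here; the binary features and labels below are hypothetical), one could use a Bernoulli naive Bayes model:

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB

# Hypothetical binary static-analysis features per app, e.g. presence of
# particular permissions or API calls (1 = present, 0 = absent).
X_train = np.array([
    [1, 1, 0, 1],   # app 1
    [1, 0, 0, 1],   # app 2
    [0, 0, 1, 0],   # app 3
    [0, 1, 1, 0],   # app 4
])
y_train = np.array([1, 1, 0, 0])  # 1 = malicious, 0 = benign (toy labels)

clf = BernoulliNB()
clf.fit(X_train, y_train)

new_app = np.array([[1, 1, 0, 0]])
print("P(benign), P(malicious):", clf.predict_proba(new_app)[0])
```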
Abstract:
Slow-release drugs must be manufactured to meet target specifications with respect to dissolution curve profiles. In this paper we consider the problem of identifying the drivers of dissolution curve variability of a drug from historical manufacturing data. Several data sources are considered: raw material parameters, coating data, loss on drying and pellet size statistics. The methodology employed is to develop predictive models using LASSO, a powerful machine learning algorithm for regression with high-dimensional datasets. LASSO provides sparse solutions, facilitating the identification of the most important causes of variability in the drug fabrication process. The proposed methodology is illustrated using manufacturing data for a slow-release drug.
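A minimal sketch of this LASSO-based variable selection, using synthetic stand-in data rather than the actual manufacturing records, could look like:

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for historical manufacturing data: rows are batches,
# columns are candidate drivers (raw material, coating, drying, pellet size, ...).
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 30))
y = 2.0 * X[:, 3] - 1.5 * X[:, 17] + rng.normal(scale=0.5, size=80)  # dissolution response

X_scaled = StandardScaler().fit_transform(X)
model = LassoCV(cv=5).fit(X_scaled, y)

# Sparse solution: most coefficients shrink to zero, and the remainder point
# to the most influential process parameters.
important = np.flatnonzero(model.coef_)
print("selected parameter indices:", important)
print("coefficients:", model.coef_[important])
```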
Abstract:
Biodegradable polymers, such as PLA (polylactide), come from renewable resources such as corn starch and, if disposed of correctly, degrade and become harmless to the ecosystem, making them attractive alternatives to petroleum-based polymers. PLA in particular is used in a variety of applications including medical devices, food packaging and waste disposal packaging. However, the industry faces challenges in melt processing of PLA due to its poor thermal stability, which is influenced by processing temperature and shear.
Identification and control of suitable processing conditions is extremely challenging, usually relying on trial and error, and is often sensitive to batch-to-batch variations. Off-line assessment in a lab environment can result in high scrap rates, long lead times, and lengthy and expensive process development. Scrap rates are typically in the region of 25-30% for medical-grade PLA, which costs €2000-€5000/kg.
Additives are used to enhance material properties, such as mechanical properties, and may also play a therapeutic role in bioresorbable medical devices; for example, the release of calcium from orthopaedic implants such as fixation screws promotes healing. Additives can also reduce costs, as less of the polymer resin is required.
This study investigates the scope for monitoring, modelling and optimising processing conditions for twin-screw extrusion of PLA and PLA with calcium carbonate to achieve desired material properties. A DAQ system was constructed to gather data from a bespoke measurement die, comprising melt temperature, pressure drop along the length of the die, and UV-Vis spectral data, which is shown to correlate with filler dispersion. Trials were carried out under a range of processing conditions using a Design of Experiments approach, and samples were tested for mechanical properties, degradation rate and the release rate of calcium. Relationships between the recorded process data and the material characterisation results are explored.
Abstract:
The Magellanic Clouds are uniquely placed to study the stellar contribution to dust emission. Individual stars can be resolved in these systems even in the mid-infrared, and they are close enough to allow detection of infrared excess caused by dust. We have searched the Spitzer Space Telescope data archive for all Infrared Spectrograph (IRS) staring-mode observations of the Small Magellanic Cloud (SMC) and found that 209 Infrared Array Camera (IRAC) point sources within the footprint of the Surveying the Agents of Galaxy Evolution in the Small Magellanic Cloud (SAGE-SMC) Spitzer Legacy programme were targeted, within a total of 311 staring-mode observations. We classify these point sources using a decision tree method of object classification, based on infrared spectral features, continuum and spectral energy distribution shape, bolometric luminosity, cluster membership and variability information. We find 58 asymptotic giant branch (AGB) stars, 51 young stellar objects, 4 post-AGB objects, 22 red supergiants, 27 stars (of which 23 are dusty OB stars), 24 planetary nebulae (PNe), 10 Wolf-Rayet stars, 3 H II regions, 3 R Coronae Borealis stars, 1 Blue Supergiant and 6 other objects, including 2 foreground AGB stars. We use these classifications to evaluate the success of photometric classification methods reported in the literature.
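The published decision tree is defined by the spectral and photometric criteria given in the paper; purely as an illustration of this style of rule cascade (the diagnostics and thresholds below are hypothetical, not the published criteria):

```python
def classify_point_source(src):
    """Toy decision-tree cascade over spectral/SED diagnostics.
    All keys and thresholds are illustrative placeholders."""
    if src["has_silicate_dust_features"]:
        # Split dusty evolved stars by a (hypothetical) luminosity cut.
        return "red supergiant" if src["luminosity_lsun"] > 5e4 else "AGB star"
    if src["has_ice_absorption"] or src["sed_rising_to_mid_ir"]:
        return "young stellar object"
    if src["has_fine_structure_lines"]:
        return "planetary nebula"
    return "other"

example = {
    "has_silicate_dust_features": True,
    "luminosity_lsun": 8e3,
    "has_ice_absorption": False,
    "sed_rising_to_mid_ir": False,
    "has_fine_structure_lines": False,
}
print(classify_point_source(example))  # -> "AGB star"
```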
Abstract:
Single-component geochemical maps are the most basic representation of spatial elemental distributions and are commonly used in environmental and exploration geochemistry. However, the compositional nature of geochemical data imposes several limitations on how the data should be presented. The problems relate to the constant-sum problem (closure) and to the inherently multivariate, relative information conveyed by compositional data. Well known, for instance, is the tendency of all heavy metals to show lower values in soils with significant contributions of diluting elements (e.g., the quartz dilution effect), or the contrary effect, apparent enrichment in many elements due to removal of potassium during weathering. The validity of classical single-component maps is thus investigated, and reasonable alternatives that honour the compositional character of geochemical concentrations are presented. The first recommended method relies on knowledge-driven log-ratios, chosen to highlight certain geochemical relations or to filter known artefacts (e.g., dilution with SiO2 or volatiles). This is similar to the classical approach of normalising to a single element. The second approach uses so-called log-contrasts, which employ suitable statistical methods (such as classification techniques, regression analysis, principal component analysis, clustering of variables, etc.) to extract potentially interesting geochemical summaries. The caution from this work is that, if a compositional approach is not used, it becomes difficult to guarantee that any identified pattern, trend or anomaly is not an artefact of the constant-sum constraint. In summary, the authors recommend a chain of enquiry that involves searching for the appropriate statistical method to answer the required geological or geochemical question while maintaining the integrity of the compositional nature of the data. The required log-ratio transformations should be applied, followed by the chosen statistical method. Interpreting the results may require a closer working relationship between statisticians, data analysts and geochemists.
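As one concrete example of such a log-ratio transformation, a centred log-ratio (clr) transform could be applied before the chosen statistical method; the composition values below are illustrative, not from the paper:

```python
import numpy as np

def clr(composition):
    """Centred log-ratio transform of a single composition (all parts must be > 0)."""
    x = np.asarray(composition, dtype=float)
    log_x = np.log(x)
    return log_x - log_x.mean()

# Illustrative major-element composition in wt% (values are made up).
sample = {"SiO2": 62.0, "Al2O3": 15.0, "Fe2O3": 6.0, "CaO": 4.0, "K2O": 3.0, "other": 10.0}
transformed = clr(list(sample.values()))
for element, value in zip(sample, transformed):
    print(f"clr({element}) = {value:+.3f}")

# The clr values are scale-invariant: multiplying the raw composition by any
# constant (e.g. reporting ppm instead of wt%) leaves them unchanged, which
# side-steps the constant-sum (closure) artefacts discussed above.
```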