595 results for methods: analytical


Relevance:

20.00%

Publisher:

Abstract:

As a new research method supplementing the existing qualitative and quantitative approaches, agent-based modelling and simulation (ABMS) may fit well within the entrepreneurship field because the core concepts and basic premises of entrepreneurship coincide with the characteristics of ABMS (McKelvey, 2004; Yang & Chandra, 2013). Agent-based simulation is a simulation method built on agent-based models, which are composed of heterogeneous agents and their behavioural rules. By repeatedly running agent-based simulations on a computer, researchers reproduce each agent’s behaviour, the agents’ interactions, and the macroscopic phenomena that emerge as time unfolds. Using agent-based simulations, researchers may therefore investigate the temporal or dynamic effects of individual agents’ behaviours.
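
A minimal sketch of the simulation loop this describes (the agent rule, parameters, and neighbourhood scheme below are illustrative assumptions, not taken from the abstract):

```python
import random

class Agent:
    """Hypothetical entrepreneur agent with a simple imitation rule."""
    def __init__(self, propensity):
        self.propensity = propensity   # baseline probability of acting this step
        self.active = False

    def step(self, neighbours):
        # Behavioural rule (illustrative): act with own propensity,
        # nudged upward by the share of active interaction partners.
        social_pull = sum(n.active for n in neighbours) / max(len(neighbours), 1)
        self.active = random.random() < min(1.0, self.propensity + 0.3 * social_pull)

def run_simulation(n_agents=100, n_steps=50, seed=1):
    random.seed(seed)
    agents = [Agent(random.uniform(0.0, 0.2)) for _ in range(n_agents)]
    history = []
    for _ in range(n_steps):
        for agent in agents:
            partners = random.sample(agents, 5)            # random interaction partners
            agent.step(partners)
        history.append(sum(a.active for a in agents))      # emergent macro-level measure
    return history

print(run_simulation())
```

Repeating such runs with different seeds and rules is what exposes the temporal, macro-level patterns that emerge from the individual behavioural rules.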

Relevance:

20.00%

Publisher:

Abstract:

In various embodiments, optoelectronic devices are described herein. The optoelectronic device may include an optoelectronic cell arranged so as to wrap around a central axis, wherein the cell includes a first conductive layer, a semi-conductive layer disposed over and in electrical communication with the first conductive layer, and a second conductive layer disposed over and in electrical communication with the semi-conductive layer. In various embodiments, methods for making optoelectronic devices are described herein. The methods may include forming an optoelectronic cell while flat and then wrapping the optoelectronic cell around a central axis. The optoelectronic devices may be photovoltaic devices. Alternatively, the optoelectronic devices may be organic light emitting diodes.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we introduce the Stochastic Adams-Bashforth (SAB) and Stochastic Adams-Moulton (SAM) methods as extensions of the tau-leaping framework that incorporate past information. Using the theta-trapezoidal tau-leap method of weak order two as a starting procedure, we show that the k-step SAB method with k >= 3 is order three in the mean and correlation, while a predictor-corrector implementation of the SAM method is weak order three in the mean but only order one in the correlation. These convergence results have been derived analytically for linear problems and successfully tested numerically for both linear and non-linear systems. A series of additional examples has been implemented to demonstrate the efficacy of this approach.
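
The abstract does not spell out the schemes themselves; as a generic illustration of the idea of reusing past propensity evaluations in a tau-leap step (an Adams-Bashforth-style two-step leap for a toy birth-death process, not the authors' weak order three construction):

```python
import numpy as np

rng = np.random.default_rng(0)

# Birth-death process: X -> X+1 at rate b, X -> X-1 at rate d*X.
def propensities(x, b=10.0, d=0.1):
    return np.array([b, d * x])

stoich = np.array([+1, -1])   # state change per reaction channel

def ab2_tau_leap(x0, tau, n_steps):
    """Two-step Adams-Bashforth-style tau-leap (illustrative only, not the SAB scheme)."""
    x, a_prev = x0, propensities(x0)
    # Bootstrap with one ordinary (explicit) tau-leap step.
    x = x + stoich @ rng.poisson(a_prev * tau)
    path = [x0, x]
    for _ in range(n_steps - 1):
        a_curr = propensities(x)
        # Classical AB2 weights (3/2, -1/2) applied to past and present propensities.
        rates = np.maximum(1.5 * a_curr - 0.5 * a_prev, 0.0)
        x = x + stoich @ rng.poisson(rates * tau)
        a_prev = a_curr
        path.append(x)
    return np.array(path)

print(ab2_tau_leap(x0=50, tau=0.05, n_steps=200)[-5:])
```

The hypothetical `ab2_tau_leap` helper simply weights current and previous propensities before drawing the Poisson increments; the SAB/SAM constructions analysed in the paper are more elaborate.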

Relevance:

20.00%

Publisher:

Abstract:

Disclosed are methods for detecting the presence of a carcinoma or an increased likelihood that a carcinoma is present in a subject. More particularly, the present invention discloses methods for diagnosis, screening, treatment and monitoring of carcinomas associated with aberrant DNA methylation of the MED15 promoter region.

Relevance:

20.00%

Publisher:

Abstract:

As technological capabilities for capturing, aggregating, and processing large quantities of data continue to improve, the question becomes how to effectively utilise these resources. Whenever automatic methods fail, it is necessary to rely on human background knowledge, intuition, and deliberation. This creates demand for data exploration interfaces that support the analytical process, allowing users to absorb and derive knowledge from data. Such interfaces have historically been designed for experts. However, existing research has shown promise in involving a broader range of users who act as citizen scientists, which places high demands on usability. Visualisation is one of the most effective analytical tools for humans to process abstract information. Our research focuses on the development of interfaces to support collaborative, community-led inquiry into data, which we refer to as Participatory Data Analytics. The development of data exploration interfaces to support independent investigations by local communities around topics of their interest presents a unique set of challenges, which we discuss in this paper. We present our preliminary work towards suitable high-level abstractions and interaction concepts that allow users to construct and tailor visualisations to their own needs.

Relevance:

20.00%

Publisher:

Abstract:

This workshop will snapshot Bourdieu's sociology. In recognition of Bourdieu's work as a powerful theoretical instrument for theorising the reproduction of social orders and cultural values, the workshop will first discuss the core concepts of habitus, capital, and field, the foundational triad of Bourdieu's sociology. Although Bourdieu's original work was built on some quantitative studies, his sociology has largely been used qualitatively in education research. In contrast to the bulk of extant research, the workshop will then showcase some quantitative and mixed methods research that uses a Bourdieusian framework. Mindful that such a framework helps explain social practice at a macro level, the workshop will go on to think through the macro and the micro by weaving together Bourdieu's sociology with Garfinkel's ethnomethodology. The workshop will conclude with some reflections on how to better realise the full value of Bourdieu in education research.

Relevance:

20.00%

Publisher:

Abstract:

The project applied analytical facilities to characterize the composition and mechanical properties of osteoporotic maxillary bone using an ovariectomized rat model. It was found that osteoporotic jaw bone contained different amounts of trace elements in comparison with normal bone, and these elements play a significant role in bone quality. The knowledge generated from the study will assist the treatment of jaw bone fracture and dental implant placement.

Relevance:

20.00%

Publisher:

Abstract:

This review is focused on the impact of chemometrics for resolving data sets collected from investigations of the interactions of small molecules with biopolymers. These samples have been analyzed with various instrumental techniques, such as fluorescence, ultraviolet–visible spectroscopy, and voltammetry. The impact of two powerful and demonstrably useful multivariate methods for the resolution of complex data, multivariate curve resolution–alternating least squares (MCR–ALS) and parallel factor analysis (PARAFAC), is highlighted through analysis of applications involving the interactions of small molecules with the biopolymers serum albumin and deoxyribonucleic acid. The outcomes illustrated that significant information extracted by the chemometric methods was unattainable by simple, univariate data analysis. In addition, although the techniques used to collect data were confined to ultraviolet–visible spectroscopy, fluorescence spectroscopy, circular dichroism, and voltammetry, data profiles produced by other techniques may also be processed. Topics considered include binding sites and modes, cooperative and competitive small-molecule binding, kinetics and thermodynamics of ligand binding, and the folding and unfolding of biopolymers. Applications of the MCR–ALS and PARAFAC methods reviewed were primarily published between 2008 and 2013.
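
As a minimal sketch of the MCR–ALS idea referred to above (synthetic data and a bare-bones alternating least squares loop, with non-negativity imposed by simple clipping; not one of the reviewed implementations):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic bilinear data: D (samples x wavelengths) = C @ S.T + noise
C_true = np.abs(rng.normal(size=(30, 2)))           # concentration profiles
S_true = np.abs(rng.normal(size=(100, 2)))          # spectral profiles
D = C_true @ S_true.T + 0.01 * rng.normal(size=(30, 100))

def mcr_als(D, n_components=2, n_iter=200):
    """Bare-bones MCR-ALS: alternate least squares for C and S, clipping negatives."""
    S = np.abs(rng.normal(size=(D.shape[1], n_components)))   # initial spectra guess
    for _ in range(n_iter):
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
    return C, S

C_est, S_est = mcr_als(D)
residual = np.linalg.norm(D - C_est @ S_est.T) / np.linalg.norm(D)
print(f"relative residual: {residual:.4f}")
```

In practice, proper non-negative least squares, additional constraints, and checks for rotational ambiguity replace the simple clipping used in this sketch.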

Relevance:

20.00%

Publisher:

Abstract:

Fractional differential equations are becoming increasingly used as a powerful modelling approach for understanding the many aspects of nonlocality and spatial heterogeneity. However, the numerical approximation of these models is demanding and imposes a number of computational constraints. In this paper, we introduce Fourier spectral methods as an attractive and easy-to-code alternative for the integration of fractional-in-space reaction-diffusion equations described by the fractional Laplacian in bounded rectangular domains of R^n. The main advantages of the proposed schemes are that they yield a fully diagonal representation of the fractional operator, with increased accuracy and efficiency when compared to low-order counterparts, and a completely straightforward extension to two and three spatial dimensions. Our approach is illustrated by solving several problems of practical interest, including the fractional Allen–Cahn, FitzHugh–Nagumo and Gray–Scott models, together with an analysis of the properties of these systems in terms of the fractional power of the underlying Laplacian operator.
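
A minimal sketch of the diagonal representation described above, applied to a 1D fractional Allen–Cahn equation; the periodic domain, grid size, and semi-implicit Euler step are simplifying assumptions made here (the paper treats bounded rectangular domains and higher-order schemes):

```python
import numpy as np

# 1D fractional Allen-Cahn: u_t = -kappa * (-Laplacian)^(alpha/2) u + u - u^3
N, L = 256, 2 * np.pi
alpha, kappa, dt, n_steps = 1.5, 0.01, 1e-3, 5000

x = np.linspace(0, L, N, endpoint=False)
k = np.fft.fftfreq(N, d=L / N) * 2 * np.pi           # Fourier wavenumbers
frac_symbol = np.abs(k) ** alpha                      # diagonal symbol of (-Laplacian)^(alpha/2)

u = 0.1 * np.random.default_rng(0).standard_normal(N)   # small random initial condition
for _ in range(n_steps):
    nonlinear = u - u ** 3
    # Semi-implicit Euler: diffusion treated implicitly (diagonal in Fourier space),
    # reaction treated explicitly.
    u_hat = (np.fft.fft(u) + dt * np.fft.fft(nonlinear)) / (1 + dt * kappa * frac_symbol)
    u = np.real(np.fft.ifft(u_hat))

print(u.min(), u.max())   # the solution should settle towards the wells near -1 and +1
```

The key point the abstract makes is visible in `frac_symbol`: the fractional operator acts as a simple elementwise multiplication in Fourier space, so no dense matrix is ever formed.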

Relevance:

20.00%

Publisher:

Abstract:

Flos Chrysanthemum is a generic name for a particular group of edible plants, which also have medicinal properties. There are, in fact, twenty to thirty different cultivars, which are commonly used in beverages and for medicinal purposes. In this work, four Flos Chrysanthemum cultivars, Hangju, Taiju, Gongju, and Boju, were collected, and chromatographic fingerprints were used to distinguish and assess these cultivars for quality control purposes. Chromatographic fingerprints contain chemical information but often suffer from baseline drift and peak shifts, which complicate data processing; adaptive iteratively reweighted penalized least squares and correlation optimized warping were therefore applied to correct the fingerprint peaks. The adjusted data were submitted to unsupervised and supervised pattern recognition methods. Principal component analysis was used to qualitatively differentiate the Flos Chrysanthemum cultivars. Partial least squares, continuum power regression, and K-nearest neighbors were used to predict the unknown samples. Finally, the elliptic joint confidence region method was used to evaluate the prediction ability of these models. The partial least squares and continuum power regression methods were shown to best represent the experimental results.
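
A minimal sketch of the unsupervised and supervised pattern-recognition steps named above (the fingerprint matrix, labels, and component counts are placeholders, and the baseline correction and warping steps are omitted):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelBinarizer

rng = np.random.default_rng(0)

# Placeholder data: 80 fingerprints x 500 retention-time points, 4 cultivars.
X = rng.normal(size=(80, 500))
y = np.repeat(["Hangju", "Taiju", "Gongju", "Boju"], 20)

# Unsupervised: PCA scores give a qualitative view of cultivar groupings.
scores = PCA(n_components=2).fit_transform(X)
print("PCA scores shape:", scores.shape)

# Supervised: PLS regression on one-hot labels (a PLS-DA-style classifier).
lb = LabelBinarizer()
Y = lb.fit_transform(y)
X_tr, X_te, Y_tr, Y_te, y_tr, y_te = train_test_split(X, Y, y, test_size=0.25, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, Y_tr)
pred = lb.classes_[np.argmax(pls.predict(X_te), axis=1)]
print("held-out accuracy:", np.mean(pred == y_te))
```

With real fingerprints, the corrected and aligned chromatograms would replace the random matrix `X`, and model quality would be judged with the validation tools the abstract describes.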

Relevance:

20.00%

Publisher:

Abstract:

Whole genome sequences are generally accepted as excellent tools for studying evolutionary relationships. Because of the problems caused by uncertainty in alignment, existing tools for phylogenetic analysis based on multiple alignments cannot be directly applied to whole-genome comparison and phylogenomic studies. There has been a growing interest in alignment-free methods for phylogenetic analysis using complete genome data. The “distances” used in these alignment-free methods are not proper distance metrics in the strict mathematical sense. In this study, we first review them within the more general framework of dissimilarity. We then propose some new dissimilarities for phylogenetic analysis. Finally, three genome datasets are employed to evaluate these dissimilarities from a biological point of view.
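
As a generic example of this family of methods (the abstract does not specify which dissimilarities are proposed), one widely used alignment-free dissimilarity compares k-mer frequency vectors of whole genomes:

```python
from itertools import product
import numpy as np

def kmer_frequencies(seq, k=4):
    """Frequency vector over all 4**k DNA k-mers."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    index = {kmer: i for i, kmer in enumerate(kmers)}
    counts = np.zeros(len(kmers))
    for i in range(len(seq) - k + 1):
        j = index.get(seq[i:i + k])
        if j is not None:          # skip windows containing ambiguous bases
            counts[j] += 1
    total = counts.sum()
    return counts / total if total else counts

def euclidean_dissimilarity(seq_a, seq_b, k=4):
    return float(np.linalg.norm(kmer_frequencies(seq_a, k) - kmer_frequencies(seq_b, k)))

print(euclidean_dissimilarity("ACGTACGTACGTTTGACA", "ACGTTTACGGACGTACGA", k=3))
```

Such dissimilarities can be computed directly from unaligned genomes and then fed to standard distance-based tree-building methods.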

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND Many patients presenting to the emergency department (ED) for assessment of possible acute coronary syndrome (ACS) have low cardiac troponin concentrations that change very little on repeat blood draw. It is unclear whether a lack of change in cardiac troponin concentration can be used to identify acutely presenting patients at low risk of ACS. METHODS We used the hs-cTnI assay from Abbott Diagnostics, which can detect cTnI in the blood of nearly all people. We identified a population of ED patients assessed for ACS with repeat cTnI measurement who were ultimately shown to have no acute cardiac disease at the time of presentation. We used data from the repeat sampling to calculate the total within-person CV (CV_T) and, knowing the assay analytical CV (CV_A), we could calculate the within-person biological variation (CV_i), reference change values (RCVs), and absolute RCV delta cTnI concentrations. RESULTS We had data sets on 283 patients. Men and women had similar CV_i values of approximately 14%, which was similar at all concentrations <40 ng/L. The biological variation was not dependent on the time interval between sample collections (t = 1.5-17 h). The absolute delta critical reference change value was similar regardless of the initial cTnI concentration. More than 90% of subjects had a critical reference change value <5 ng/L, and 97% had values <10 ng/L. CONCLUSIONS With this hs-cTnI assay, delta cTnI seems to be a useful tool for rapidly identifying ED patients at low risk for possible ACS.
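
The abstract does not quote the formulas used; the standard relationships for this kind of calculation (an assumption here, not taken from the study) are CV_i = sqrt(CV_T^2 - CV_A^2) and RCV = sqrt(2) * z * sqrt(CV_A^2 + CV_i^2), for example:

```python
import math

def within_person_cv(cv_total, cv_analytical):
    """CV_i from the total within-person CV and the analytical CV."""
    return math.sqrt(cv_total ** 2 - cv_analytical ** 2)

def reference_change_value(cv_analytical, cv_within, z=1.96):
    """Two-sided RCV (%) at the significance level implied by z."""
    return math.sqrt(2) * z * math.sqrt(cv_analytical ** 2 + cv_within ** 2)

# Illustrative inputs only; neither CV_T nor CV_A is quoted in the abstract.
cv_t, cv_a = 16.1, 8.0
cv_i = within_person_cv(cv_t, cv_a)          # ~14%, the order of magnitude reported
rcv = reference_change_value(cv_a, cv_i)
print(f"CV_i = {cv_i:.1f}%, RCV = {rcv:.1f}%")
```

A measured delta cTnI smaller than the resulting RCV (in percentage or absolute terms) is unlikely to exceed what analytical plus biological variation alone would produce.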

Relevance:

20.00%

Publisher:

Abstract:

A novel, highly selective resonance light scattering (RLS) method was researched and developed for the analysis of phenol in different types of industrial water. An important aspect of the method involved the use of graphene quantum dots (GQDs), which were initially obtained from the pyrolysis of citric acid dissolved in aqueous solutions. The GQDs in the presence of horseradish peroxidase (HRP) and H2O2 were found to react quantitatively with phenol such that the RLS spectral band (310 nm) was quantitatively enhanced as a consequence of the interaction between the GQDs and the quinone formed in the above reaction. It was demonstrated that the novel analytical method had better selectivity and sensitivity for the determination of phenol in water as compared to other analytical methods found in the literature. Thus, trace amounts of phenol were detected over the linear ranges of 6.00×10−8–2.16×10−6 M and 2.40×10−6–2.88×10−5 M with a detection limit of 2.20×10−8 M. In addition, three different spiked waste water samples and two untreated lake water samples were analysed for phenol. Satisfactory results were obtained with the use of the novel, sensitive and rapid RLS method.

Relevance:

20.00%

Publisher:

Abstract:

In this paper the issue of finding uncertainty intervals for queries in a Bayesian Network is reconsidered. The investigation focuses on Bayesian Nets with discrete nodes and finite populations. An earlier asymptotic approach is compared with a simulation-based approach and with two further alternatives: one based on a single sample of the Bayesian Net at a particular finite population size, and another which uses expected population sizes together with exact probabilities. We conclude that a query of a Bayesian Net should be expressed as a probability embedded in an uncertainty interval. Based on an investigation of two Bayesian Net structures, the preferred method is the simulation method. However, both the single-sample method and the expected-sample-size method may be useful and are simpler to compute. Any method at all is more useful than none when assessing a Bayesian Net under development, or when drawing conclusions from an ‘expert’ system.
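
A rough sketch of a simulation-based uncertainty interval of the kind discussed (a toy two-node network with assumed counts, not one of the structures investigated in the paper): the conditional probability tables are resampled from their posteriors given the finite counts, and the query probability is recomputed for each draw.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy network A -> B with discrete nodes, learned from a finite population.
# Observed counts (assumed for illustration): n(A=1), n(A=0), and n(B=1 | A).
n_a1, n_a0 = 40, 60
n_b1_given_a1, n_b1_given_a0 = 30, 12

def query_prob(p_a1, p_b1_a1, p_b1_a0):
    """Query: P(B = 1) by marginalising over A."""
    return p_a1 * p_b1_a1 + (1 - p_a1) * p_b1_a0

# Resample CPT entries from Beta posteriors (uniform priors) and recompute the query.
samples = [
    query_prob(
        rng.beta(1 + n_a1, 1 + n_a0),
        rng.beta(1 + n_b1_given_a1, 1 + n_a1 - n_b1_given_a1),
        rng.beta(1 + n_b1_given_a0, 1 + n_a0 - n_b1_given_a0),
    )
    for _ in range(10_000)
]

point = query_prob(n_a1 / (n_a1 + n_a0), n_b1_given_a1 / n_a1, n_b1_given_a0 / n_a0)
low, high = np.percentile(samples, [2.5, 97.5])
print(f"P(B=1) = {point:.3f}, 95% uncertainty interval ({low:.3f}, {high:.3f})")
```

The output illustrates the paper's conclusion in miniature: the query is reported as a probability embedded in an uncertainty interval rather than as a bare point estimate.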

Relevance:

20.00%

Publisher:

Abstract:

The aim of this study is to identify current knowledge gaps in the fate, exposure, and toxicity of engineered nanomaterials (ENMs), highlight research gaps, and suggest future research directions. Humans and other living organisms are exposed to ENMs during the production or use of products containing them. To assess the hazards of ENMs, it is important to determine their physicochemical properties and try to relate them to any observed hazard. However, the full determination of these relationships is currently limited by the lack of empirical data. Moreover, most toxicity studies do not use realistic environmental exposure conditions for determining dose-response parameters, affecting the accurate estimation of health risks associated with exposure to ENMs. Regulatory aspects of nanotechnology are still developing and are currently the subject of much debate. Synthesis of the available studies suggests a number of open questions. These include (i) developing a combination of different analytical methods for determining ENM concentration, size, shape, surface properties, and morphology in different environmental media, (ii) conducting toxicity studies under environmentally relevant exposure conditions and obtaining data relevant to developing quantitative nanostructure-toxicity relationships (QNTR), and (iii) developing guidelines for regulating exposure to ENMs in the environment.