915 results for random search algorithms
Abstract:
Background: Multilevel and spatial models are being increasingly used to obtain substantive information on area-level inequalities in cancer survival. Multilevel models assume independent geographical areas, whereas spatial models explicitly incorporate geographical correlation, often via a conditional autoregressive prior. However, the relative merits of these methods for large population-based studies have not been explored. Using a case-study approach, we report on the implications of using multilevel and spatial survival models to study geographical inequalities in all-cause survival. Methods: Multilevel discrete-time and Bayesian spatial survival models were used to study geographical inequalities in all-cause survival for a population-based colorectal cancer cohort of 22,727 cases aged 20–84 years diagnosed during 1997–2007 from Queensland, Australia. Results: Both approaches were viable on this large dataset and produced similar estimates of the fixed effects. After adding area-level covariates, the between-area variability in survival using multilevel discrete-time models was no longer significant. Spatial inequalities in survival were also markedly reduced after adjusting for aggregated area-level covariates. Only the multilevel approach, however, provided an estimate of the contribution of geographical variation to the total variation in survival between individual patients. Conclusions: With little difference observed between the two approaches in the estimation of fixed effects, multilevel models should be favored if there is a clear hierarchical data structure and measuring the independent impact of individual- and area-level effects on survival differences is of primary interest. Bayesian spatial analyses may be preferred if spatial correlation between areas is important and if the priority is to assess small-area variations in survival and map spatial patterns. Both approaches can be readily fitted to geographically enabled survival data from international settings.
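Since the abstract contrasts model classes rather than giving code, here is a minimal sketch of the multilevel discrete-time idea: survival records expanded into person-period form, with the discrete-time hazard fitted by logistic regression. All data and column names are illustrative, and the area-level random intercept that makes the model truly multilevel is only indicated in a comment.

```python
# Hedged sketch: discrete-time survival as logistic regression on a
# person-period data set (data and column names are illustrative).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
cases = pd.DataFrame({
    "follow_up_years": rng.integers(1, 11, n),   # discrete follow-up intervals
    "died": rng.integers(0, 2, n),               # event indicator at exit
    "age": rng.uniform(20, 84, n),
})

# Expand each case into one row per year at risk (the person-period format).
rows = []
for _, c in cases.iterrows():
    for year in range(1, int(c.follow_up_years) + 1):
        event = int(bool(c.died) and year == c.follow_up_years)
        rows.append({"year": year, "age": c.age, "event": event})
pp = pd.DataFrame(rows)

# Discrete-time hazard: logistic regression of the event indicator on the
# period and covariates; an area-level random intercept (omitted here)
# would make this the multilevel model described in the abstract.
X = sm.add_constant(pp[["year", "age"]])
fit = sm.Logit(pp["event"], X).fit(disp=0)
print(fit.params)
```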
Abstract:
In continuum one-dimensional space, a coupled directed continuous time random walk model is proposed, where the random walker jumps toward one direction and the waiting time between jumps affects the subsequent jump. In the proposed model, the Laplace-Laplace transform of the probability density function P(x,t) of finding the walker at position x at time t is completely determined by the Laplace transform of the probability density function φ(t) of the waiting time. In terms of the probability density function of the waiting time in the Laplace domain, the limit distribution of the random process and the corresponding evolving equations are derived.
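A minimal Monte Carlo sketch of one directed, coupled walk of this kind, assuming for illustration an exponential waiting-time density and a coupling in which each jump length equals the preceding waiting time (the paper derives the general Laplace-domain result rather than simulating):

```python
# Hedged sketch of a directed, coupled CTRW: the walker jumps only in the
# positive direction, and (as one illustrative coupling) each jump length
# equals the preceding waiting time. phi(t) is taken as exponential purely
# for concreteness.
import numpy as np

rng = np.random.default_rng(1)

def sample_path(t_max, rate=1.0):
    t, x = 0.0, 0.0
    while True:
        wait = rng.exponential(1.0 / rate)    # draw from phi(t)
        if t + wait > t_max:
            return x                          # position at observation time
        t += wait
        x += wait                             # coupled jump: length = waiting time

# Empirical P(x, t) at a fixed time t from many independent walkers.
samples = np.array([sample_path(t_max=10.0) for _ in range(20000)])
print(samples.mean(), samples.var())
```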
Abstract:
Information available on company websites can help people navigate to the offices of groups and individuals within the company. Automatically retrieving this within-organisation spatial information is a challenging AI problem. This paper introduces a novel unsupervised pattern-based method to extract within-organisation spatial information by taking advantage of HTML structure patterns, together with a novel Conditional Random Fields (CRF) based method to identify different categories of within-organisation spatial information. The results show that the proposed method can achieve high performance in terms of F-Score, indicating that this purely syntactic method based on web search and an analysis of HTML structure is well-suited for retrieving within-organisation spatial information.
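As a toy illustration of extraction driven purely by HTML structure patterns, the sketch below pulls name/office pairs from table rows with a regular expression; the markup and pattern are invented, and the paper's CRF-based categorisation stage is not reproduced.

```python
# Hedged sketch of pattern-based extraction from HTML structure: pull
# name/office pairs out of table rows. Markup and regex are illustrative.
import re

html = """
<table>
  <tr><td>Dr A. Smith</td><td>Room 214, Building 7</td></tr>
  <tr><td>Ms B. Jones</td><td>Room 108, Building 2</td></tr>
</table>
"""

row = re.compile(r"<tr><td>(?P<name>[^<]+)</td><td>(?P<office>[^<]+)</td></tr>")
for m in row.finditer(html):
    print(m.group("name"), "->", m.group("office"))
```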
Abstract:
In order to understand the role of translational modes in orientational relaxation in dense dipolar liquids, we have carried out a computer "experiment" in which a random dipolar lattice was generated by quenching only the translational motion of the molecules of an equilibrated dipolar liquid. The lattice so generated was orientationally disordered and positionally random. The detailed study of orientational relaxation in this random dipolar lattice revealed interesting differences from that of the corresponding dipolar liquid. In particular, we found that the relaxation of the collective orientational correlation functions at intermediate wave numbers was markedly slower at long times for the random lattice than for the liquid. This verified the important role of the translational modes in this regime, as predicted recently by molecular theories. The single-particle orientational correlation functions of the random lattice also decayed significantly more slowly at long times, compared to those of the dipolar liquid.
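A minimal sketch of the quantity being measured, the single-particle orientational correlation function C(t) = <u(0)·u(t)>, computed here from a synthetic trajectory of unit dipole vectors rather than from an actual simulation of the random dipolar lattice:

```python
# Hedged sketch: estimating a single-particle orientational correlation
# function C(t) = <u(0) . u(t)> from a trajectory of unit dipole vectors.
# The rotational-diffusion trajectory below is synthetic stand-in data.
import numpy as np

rng = np.random.default_rng(2)
n_steps, n_mol = 200, 50

# Synthetic trajectory: unit vectors diffusing on the sphere.
u = np.zeros((n_steps, n_mol, 3))
u[0] = rng.normal(size=(n_mol, 3))
u[0] /= np.linalg.norm(u[0], axis=1, keepdims=True)
for t in range(1, n_steps):
    step = u[t - 1] + 0.05 * rng.normal(size=(n_mol, 3))
    u[t] = step / np.linalg.norm(step, axis=1, keepdims=True)

# C(t) averaged over molecules and time origins; C(0) = 1 by construction.
max_lag = 100
c = [np.mean(np.sum(u[:n_steps - lag] * u[lag:], axis=2)) for lag in range(max_lag)]
print(c[:5])
```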
Abstract:
The growing interest in co-created reading experiences in both digital and print formats raises interesting questions for creative writers who work in the space of interactive fiction. This essay argues that writers have not abandoned experiments with co-creation in print narratives in favour of the attractions of the digital environment, as might be assumed from the discourse on digital development. Rather, interactive print narratives, in particular 'reader-assembled narratives', demonstrate a rich history of experimentation and continue to engage writers who wish to craft individual reading experiences for readers and to experiment with their own creative process as writers. The reader-assembled narrative has been used for many different reasons: for some writers, such as BS Johnson, it is a method of problem solving; for others, like Robert Coover, it is a way to engage the reader in a more playful sense. Authors such as Marc Saporta, BS Johnson, and Robert Coover have engaged with this type of narrative play. This examination considers the narrative experimentation of these authors as a way of offering insights into creative practice for contemporary creative writers.
Abstract:
Recently, efficient scheduling algorithms based on Lagrangian relaxation have been proposed for scheduling parallel machine systems and job shops. In this article, we develop real-world extensions to these scheduling methods. In the first part of the paper, we consider the problem of scheduling single-operation jobs on parallel identical machines and extend the methodology to handle multiple classes of jobs, taking into account setup times and setup costs. The proposed methodology uses Lagrangian relaxation and simulated annealing in a hybrid framework. In the second part of the paper, we consider a Lagrangian relaxation based method for scheduling job shops and extend it to obtain a scheduling methodology for a real-world flexible manufacturing system with centralized material handling.
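A minimal sketch of the simulated-annealing half of such a hybrid: assigning multi-class jobs to parallel identical machines, with a setup penalty whenever consecutive jobs on a machine belong to different classes. The data, cooling schedule, and cost model are illustrative, and the Lagrangian-relaxation component is omitted.

```python
# Hedged sketch: simulated annealing for parallel identical machines with
# class-based setup times. Jobs on a machine are processed in index order,
# a deliberate simplification of the sequencing problem.
import math
import random

random.seed(3)
n_jobs, n_machines, setup = 30, 4, 2.0
proc = [random.uniform(1, 10) for _ in range(n_jobs)]
cls = [random.randrange(3) for _ in range(n_jobs)]

def makespan(assign):
    loads = [0.0] * n_machines
    last_cls = [None] * n_machines
    for j, m in enumerate(assign):
        if last_cls[m] is not None and last_cls[m] != cls[j]:
            loads[m] += setup                  # class changeover setup
        loads[m] += proc[j]
        last_cls[m] = cls[j]
    return max(loads)

assign = [random.randrange(n_machines) for _ in range(n_jobs)]
cur, temp = makespan(assign), 10.0
while temp > 0.01:
    j = random.randrange(n_jobs)
    old = assign[j]
    assign[j] = random.randrange(n_machines)   # move one job to a random machine
    cand = makespan(assign)
    if cand <= cur or random.random() < math.exp((cur - cand) / temp):
        cur = cand                             # accept the move
    else:
        assign[j] = old                        # reject and restore
    temp *= 0.999                              # geometric cooling
print("approximate makespan:", cur)
```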
Abstract:
Part I (Manjunath et al., 1994, Chem. Engng Sci. 49, 1451-1463) of this paper showed that the random particle numbers and size distributions in precipitation processes in very small drops obtained by stochastic simulation techniques deviate substantially from the predictions of conventional population balance. The foregoing problem is considered in this paper in terms of a mean field approximation obtained by applying a first-order closure to the unclosed set of mean field equations presented in Part I. The mean field approximation consists of two mutually coupled partial differential equations featuring (i) the probability distribution for residual supersaturation and (ii) the mean number density of particles for each size and supersaturation, from which all average properties and fluctuations can be calculated. The mean field equations have been solved by finite difference methods for (i) crystallization and (ii) precipitation of a metal hydroxide, both occurring in a single drop of specified initial supersaturation. The results for the average number of particles, average residual supersaturation, the average size distribution, and fluctuations about the average values have been compared with those obtained by stochastic simulation techniques and by population balance. This comparison shows that the mean field predictions are substantially superior to those of population balance, as judged by the close proximity of results from the former to those from stochastic simulations. The agreement is excellent for broad initial supersaturation distributions at short times but deteriorates progressively at longer times. For steep initial supersaturation distributions, predictions of the mean field theory are not satisfactory, thus calling for higher-order approximations. The merit of the mean field approximation over stochastic simulation lies in its potential to reduce the expensive computation times involved in simulation. More effective computational techniques could not only enhance this advantage of the mean field approximation but also make it possible to use higher-order approximations, eliminating the constraints under which the stochastic dynamics of the process can be predicted accurately.
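A minimal sketch of the kind of stochastic (Gillespie-style) simulation the mean field equations aim to approximate: nucleation and growth competing for a finite supersaturation in a single drop, with particle-number fluctuations emerging naturally. The rate laws and parameters are invented for illustration and do not reproduce the paper's kinetics.

```python
# Hedged sketch: Gillespie-style simulation of nucleation vs growth in a
# single small drop. Each event consumes one unit of supersaturation;
# rate laws and parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(4)

def one_drop(s0=50, k_nuc=0.05, k_grow=1.0):
    t, s, n = 0.0, s0, 0
    while s > 0:
        r_nuc = k_nuc * s                  # nucleation of a new particle
        r_grow = k_grow * n * s            # growth of existing particles
        total = r_nuc + r_grow
        t += rng.exponential(1.0 / total)  # time to the next event
        if rng.random() < r_nuc / total:
            n += 1                         # nucleation event
        s -= 1                             # either event consumes one solute unit
    return n

# Fluctuations in final particle number across drops: the statistics that
# conventional population balance misses in very small drops.
counts = [one_drop() for _ in range(1000)]
print(np.mean(counts), np.var(counts))
```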
Abstract:
Japan is in the midst of massive law reform. Mired in ongoing recession since the early 1990s, Japan has been implementing a new regulatory blueprint to kickstart a sluggish economy through structural change. A key element of this reform process is a rethink of corporate governance and its stakeholder relations. With a patchwork of legislative initiatives in areas as diverse as corporate law, finance, labour relations, consumer protection, public administration and civil justice, this new model is beginning to take shape. But to what extent does this model represent a break from the past? Some commentators are breathlessly predicting the "Americanisation" of Japanese law. They see the triumph of Western-style capitalism - the "End of History", to borrow the words of Francis Fukuyama - with its emphasis on market-based, arms-length transactions. Others are more cautious, advancing the view that these new reforms are merely "creative twists" on what is a uniquely (although slowly evolving) strand of Japanese capitalism. This paper takes issue with both interpretations. It argues that the new reforms merely follow Japan's long tradition of 'adopting and adapting' foreign models to suit domestic purposes. They are neither the wholesale importation of "Anglo-Saxon" regulatory principles nor a thin veneer over a 'uniquely unique' form of Confucian cultural capitalism. Rather, they represent a specific and largely political solution (conservative reformism) to a current economic problem (recession). The larger themes of this paper are 'change' and 'continuity'. 'Change' suggests evolution to something identifiable; 'continuity' suggests adhering to an existing state of affairs. Although notionally opposites, 'change' and 'continuity' have something in common - they both suggest some form of predictability and coherence in regulatory reform. Our paper, by contrast, submits that Japanese corporate governance reform or, indeed, law reform more generally in Japan, is context-specific, multi-layered (with different dimensions not necessarily all pulling in the same direction, for example in relations with key outside suppliers), and therefore more random or 'chaotic'.
Abstract:
The mode of action of xylanase and beta-glucosidase purified from the culture filtrate of Humicola lanuginosa (Griffon and Maublanc) Bunce on the xylan extracted from sugarcane bagasse, on two commercially available larchwood and oat spelt xylans, on xylooligomers and on arabinoxylooligomers was studied. While larchwood and oat spelt xylans were hydrolyzed to the same extent in 24 h, sugarcane bagasse xylan was hydrolyzed to a lesser extent in the same period. It was found that the rate of hydrolysis of xylooligomers by xylanase increased with increasing chain length, while beta-glucosidase acted rather slowly on all the oligomers tested. Xylanase exhibited predominantly "endo" action on xylooligomers, attacking the xylan chain at random, while beta-glucosidase had "exo" action, releasing one xylose residue at a time. On arabinoxylooligomers, however, xylanase exhibited "exo" action. Thus, it appears that the presence of the arabinose substituent has, in some way, rendered the terminal xylose-xylose linkage more susceptible to xylanase action. It was also observed that even after extensive hydrolysis with both enzymes, substantial amounts of the parent arabinoxylooligomer remained unhydrolyzed, together with an accumulation of arabinoxylobiose. It can therefore be concluded that the presence of the arabinose substituent in the xylan chain results in linkages that offer resistance to both xylanase and beta-glucosidase action.
Abstract:
We study, by means of experiments and Monte Carlo simulations, the scattering of light in random media, to determine the distance up to which photons travel along almost undeviated paths within a scattering medium and are therefore capable of casting a shadow of an opaque inclusion embedded within the medium. Such photons are isolated by polarisation discrimination, wherein the plane of linear polarisation of the input light is continuously rotated and the polarisation-preserving component of the emerging light is extracted by means of a Fourier transform. This technique is a software implementation of lock-in detection. We find that images may be recovered to a depth far in excess of that predicted by the diffusion theory of photon propagation. To understand our experimental results, we perform Monte Carlo simulations to model the random walk behaviour of the multiply scattered photons. We present a new definition of a diffusing photon in terms of the memory of its initial direction of propagation, which we then quantify in terms of an angular correlation function. This redefinition yields the penetration depth of the polarisation-preserving photons. Based on these results, we have formulated a model to understand shadow formation in a turbid medium, the predictions of which are in good agreement with our experimental results.
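A minimal sketch of the angular-correlation idea: Monte Carlo photons launched along one direction, with forward-peaked scattering mimicked by a small random perturbation of the direction vector, and the memory of the launch direction tracked as <u(0)·u(n)>. The phase function and medium geometry of the paper's simulations are not reproduced.

```python
# Hedged sketch: Monte Carlo estimate of how multiply scattered photons
# "remember" their launch direction, via <u(0) . u(n)> after n scattering
# events. The perturbation model of scattering is illustrative.
import numpy as np

rng = np.random.default_rng(5)
n_photons, n_events, width = 5000, 40, 0.3

u = np.tile([0.0, 0.0, 1.0], (n_photons, 1))      # all photons launched along +z
corr = []
for _ in range(n_events):
    u = u + width * rng.normal(size=u.shape)      # one forward-peaked scattering event
    u /= np.linalg.norm(u, axis=1, keepdims=True)
    corr.append(u[:, 2].mean())                   # <u(0) . u(n)>, since u(0) = z-hat

# The decay of this correlation sets the depth over which photons still
# travel along nearly undeviated paths and can cast a shadow.
print(corr[:5])
```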
Abstract:
A beam-column resting on a continuous Winkler foundation and discrete elastic supports is considered. The beam-column is of variable cross-section, and the variation of sectional properties along the axis of the beam-column is deterministic. Young's modulus, mass per unit length and distributed axial loadings of the beam-column have a stochastic distribution. The foundation stiffness coefficient of the Winkler model, the stiffnesses of discrete elastic supports, the stiffnesses of end springs and the end thrust are all considered as random parameters. The material property fluctuations and distributed axial loadings are considered to constitute independent, one-dimensional univariate homogeneous real stochastic fields in space. The foundation stiffness coefficient, stiffnesses of the discrete elastic supports, stiffnesses of end springs and the end thrust are considered to constitute independent random variables. Static response, free vibration and stability behaviour of the beam-column are studied. Hamilton's principle is used to formulate the problem using stochastic FEM. Sensitivity vectors of the response and stability parameters are evaluated. Using these, statistics of free vibration frequencies, mode shapes, buckling parameters, etc., are evaluated. A numerical example is given.
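A minimal sketch of a Monte Carlo stochastic-FEM calculation in this spirit: the fundamental frequency of a simply supported Euler-Bernoulli beam on a Winkler foundation, with a single random Young's modulus standing in for the abstract's stochastic fields. All parameter values are illustrative.

```python
# Hedged sketch: Monte Carlo stochastic FEM for the fundamental frequency
# of a simply supported beam on a Winkler foundation with random E.
import numpy as np
from scipy.linalg import eigh

def beam_matrices(E, I, rho, A, kw, L, n_el):
    le = L / n_el
    ndof = 2 * (n_el + 1)
    K = np.zeros((ndof, ndof)); M = np.zeros((ndof, ndof))
    ke = (E * I / le**3) * np.array(          # Euler-Bernoulli element stiffness
        [[12, 6*le, -12, 6*le],
         [6*le, 4*le**2, -6*le, 2*le**2],
         [-12, -6*le, 12, -6*le],
         [6*le, 2*le**2, -6*le, 4*le**2]])
    shape = np.array(
        [[156, 22*le, 54, -13*le],
         [22*le, 4*le**2, 13*le, -3*le**2],
         [54, 13*le, 156, -22*le],
         [-13*le, -3*le**2, -22*le, 4*le**2]])
    me = (rho * A * le / 420) * shape         # consistent mass matrix
    kf = (kw * le / 420) * shape              # Winkler foundation stiffness
    for e in range(n_el):
        d = slice(2 * e, 2 * e + 4)
        K[d, d] += ke + kf
        M[d, d] += me
    free = [i for i in range(ndof) if i not in (0, ndof - 2)]  # pin both ends
    return K[np.ix_(free, free)], M[np.ix_(free, free)]

rng = np.random.default_rng(6)
freqs = []
for _ in range(200):                          # Monte Carlo over the random modulus
    E = rng.lognormal(np.log(2.1e11), 0.1)
    K, M = beam_matrices(E, I=1e-6, rho=7800, A=1e-3, kw=1e5, L=2.0, n_el=20)
    lam = eigh(K, M, eigvals_only=True)
    freqs.append(np.sqrt(lam[0]) / (2 * np.pi))
print("mean f1 [Hz]:", np.mean(freqs), "std:", np.std(freqs))
```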
Abstract:
Background: A genetic network can be represented as a directed graph in which a node corresponds to a gene and a directed edge specifies the direction of influence of one gene on another. The reconstruction of such networks from transcript profiling data remains an important yet challenging endeavor. A transcript profile specifies the abundances of many genes in a biological sample of interest. Prevailing strategies for learning the structure of a genetic network from high-dimensional transcript profiling data assume sparsity and linearity. Many methods consider relatively small directed graphs, inferring graphs with up to a few hundred nodes. This work examines large undirected graph representations of genetic networks, graphs with many thousands of nodes where an undirected edge between two nodes does not indicate the direction of influence, and the problem of estimating the structure of such a sparse linear genetic network (SLGN) from transcript profiling data. Results: The structure learning task is cast as a sparse linear regression problem, which is then posed as a LASSO (l1-constrained fitting) problem and finally solved by formulating a Linear Program (LP). A bound on the Generalization Error of this approach is given in terms of the Leave-One-Out Error. The accuracy and utility of LP-SLGNs is assessed quantitatively and qualitatively using simulated and real data. The Dialogue for Reverse Engineering Assessments and Methods (DREAM) initiative provides gold standard data sets and evaluation metrics that enable and facilitate the comparison of algorithms for deducing the structure of networks. The structures of LP-SLGNs estimated from the INSILICO1, INSILICO2 and INSILICO3 simulated DREAM2 data sets are comparable to those proposed by the first and/or second ranked teams in the DREAM2 competition. The structures of LP-SLGNs estimated from two published Saccharomyces cerevisiae cell cycle transcript profiling data sets capture known regulatory associations. In each S. cerevisiae LP-SLGN, the number of nodes with a particular degree follows an approximate power law, suggesting that its degree distribution is similar to that observed in real-world networks. Inspection of these LP-SLGNs suggests biological hypotheses amenable to experimental verification. Conclusion: A statistically robust and computationally efficient LP-based method for estimating the topology of a large sparse undirected graph from high-dimensional data yields representations of genetic networks that are biologically plausible and useful abstractions of the structures of real genetic networks. Analysis of the statistical and topological properties of learned LP-SLGNs may have practical value; for example, genes with high random walk betweenness, a measure of the centrality of a node in a graph, are good candidates for intervention studies and hence for integrated computational-experimental investigations designed to infer more realistic and sophisticated probabilistic directed graphical model representations of genetic networks. The LP-based solutions of the sparse linear regression problem described here may provide a method for learning the structure of transcription factor networks from transcript profiling and transcription factor binding motif data.
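A minimal sketch of l1-constrained fitting posed as a linear program, in the spirit of (but not necessarily identical to) the paper's formulation: the l1 residual ||y - Xw||_1 is minimised subject to ||w||_1 <= t, with w split into positive and negative parts so both the objective and constraints are linear.

```python
# Hedged sketch: LASSO-style sparse regression as a linear program.
# Variables: [w+ (p), w- (p), e (n)], minimise sum(e) subject to
#   Xw - y <= e,  -(Xw - y) <= e,  sum(w+ + w-) <= t,  all vars >= 0.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(7)
n, p, t = 40, 10, 3.0
X = rng.normal(size=(n, p))
w_true = np.zeros(p); w_true[:2] = [1.5, -0.8]   # sparse ground truth
y = X @ w_true + 0.05 * rng.normal(size=n)

c = np.concatenate([np.zeros(2 * p), np.ones(n)])
A_ub = np.block([
    [X, -X, -np.eye(n)],                                   #  Xw - y <= e
    [-X, X, -np.eye(n)],                                   # -(Xw - y) <= e
    [np.ones((1, p)), np.ones((1, p)), np.zeros((1, n))],  # ||w||_1 <= t
])
b_ub = np.concatenate([y, -y, [t]])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
w_hat = res.x[:p] - res.x[p:2 * p]
print(np.round(w_hat, 2))   # non-zero entries indicate inferred network edges
```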
Abstract:
In a search for inorganic oxide materials showing second-order nonlinear optical (NLO) susceptibility, we investigated several borates, silicates, and a phosphate containing trans-connected MO6 octahedral chains or MO5 square pyramids, where M = d(0): Ti(IV), Nb(V), or Ta(V). Our investigations identified two new NLO structures: batisite, Na2Ba(TiO)2Si4O12, containing trans-connected TiO6 octahedral chains, and fresnoite, Ba2TiOSi2O7, containing square-pyramidal TiO5. Investigation of two other materials containing square-pyramidal TiO5, viz. Cs2TiOP2O7 and Na4Ti2Si8O22·4H2O, revealed that isolated TiO5 square pyramids alone do not cause a second harmonic generation (SHG) response; rather, the orientation of TiO5 units to produce -Ti-O-Ti-O- chains with alternating long and short Ti-O distances in the fresnoite structure is most likely the origin of the strong SHG response in fresnoite.
Abstract:
Internal motions in an A2BX4 compound (tetramethylammonium tetrabromocadmate) have been investigated using proton spin-lattice relaxation time (T1) and second moment (M2) measurements in the temperature range 77 to 400 K. T1 measurements at three Larmor frequencies (10, 20 and 30 MHz) show isotropic tumbling of the tetramethylammonium group, random reorientation of methyl groups and spin-rotation interaction, and the corresponding parameters have been computed. The CW spectrum is narrow throughout the temperature range and shows sidebands at the lowest temperature. This observation, along with the free-induction-decay behavior at these temperatures, is interpreted as the onset of a coherent motion, e.g. methyl group quantum tunnelling.
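For context, frequency-dependent T1 data of this kind are conventionally analysed with a BPP-type expression relating the relaxation rate to the correlation time of the reorienting group; a standard form (quoted here for reference, not taken from the abstract) is

\[ \frac{1}{T_1} = C \left[ \frac{\tau_c}{1 + \omega_0^2 \tau_c^2} + \frac{4\tau_c}{1 + 4\omega_0^2 \tau_c^2} \right], \]

where \( \omega_0 \) is the Larmor frequency, \( \tau_c \) the motional correlation time, and \( C \) a dipolar coupling constant; fitting the T1 minima at several Larmor frequencies yields the motional parameters the abstract refers to.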
Abstract:
This paper proposes new metrics and a performance-assessment framework for vision-based weed and fruit detection and classification algorithms. In order to compare algorithms, and make a decision on which one to use for a particular application, it is necessary to take into account that the performance obtained in a series of tests is subject to uncertainty. Such characterisation of uncertainty seems not to be captured by the performance metrics currently reported in the literature. Therefore, we pose the problem as a general problem of scientific inference, which arises out of incomplete information, and propose as a metric of performance the (posterior) predictive probabilities that the algorithms will provide a correct outcome for target and background detection. We detail the framework through which these predicted probabilities can be obtained, which is Bayesian in nature. As an illustrative example, we apply the framework to the assessment of performance of four algorithms that could potentially be used in the detection of capsicums (peppers).
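A minimal sketch of one simple instantiation of such a Bayesian performance metric: the posterior predictive probability of a correct detection under a Beta-Binomial model, which rewards accuracy backed by more evidence. The paper's framework may use different priors and likelihoods.

```python
# Hedged sketch: posterior predictive probability of a correct detection
# under a Beta-Binomial model (Beta(a, b) prior on the success rate).
def predictive_correct(successes, trials, a=1.0, b=1.0):
    # Posterior is Beta(a + k, b + n - k); the posterior predictive
    # probability of the next outcome being correct is its mean.
    return (a + successes) / (a + b + trials)

# Two hypothetical detectors with identical empirical accuracy but very
# different amounts of evidence behind them:
print(predictive_correct(9, 10))      # 90% correct on 10 test images
print(predictive_correct(900, 1000))  # 90% correct on 1000 test images
```

The two calls illustrate the point of the metric: the detector tested on 1000 images earns a predictive probability much closer to its empirical 90% than the one tested on 10, because the posterior carries the uncertainty of the smaller test set.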